
Modelling wildland fire burn severity in California using a spatial Super Learner approach (2311.16187v1)

Published 25 Nov 2023 in cs.LG and stat.AP

Abstract: Given the increasing prevalence of wildland fires in the Western US, there is a critical need for tools to understand and accurately predict burn severity. We develop a machine learning model that predicts post-fire burn severity from pre-fire remotely sensed data. Hydrological, ecological, and topographical variables collected from four regions of California, the sites of the Kincade fire (2019), the CZU Lightning Complex fire (2020), the Windy fire (2021), and the KNP Complex fire (2021), are used as predictors of the differenced normalized burn ratio (dNBR). We hypothesize that a Super Learner (SL) algorithm that accounts for spatial autocorrelation using Vecchia's Gaussian approximation will accurately model burn severity. In all combinations of test and training sets explored, the SL algorithm outperformed standard linear regression methods. After fitting and verifying the performance of the SL model, we use interpretable machine learning tools to determine the main drivers of severe burn damage, including greenness, elevation, and fire weather variables. These findings provide actionable insights that enable communities to plan interventions such as early fire detection systems, pre-fire-season vegetation clearing, and resource allocation during emergency response. When implemented, this model has the potential to minimize the loss of human life, property, resources, and ecosystems in California.
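The pipeline the abstract describes can be sketched with generic tools. The block below is an illustrative sketch, not the authors' implementation: it computes dNBR from pre- and post-fire reflectance bands using the standard normalized burn ratio formula, then compares a Super Learner (cross-validated stacking of heterogeneous base learners with a linear meta-learner, here via scikit-learn's `StackingRegressor`) against plain linear regression on synthetic data. The predictor set, base-learner choices, and synthetic response are assumptions for demonstration; the paper's spatial (Vecchia) component is omitted.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import ElasticNet, LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor


def dnbr(nir_pre, swir_pre, nir_post, swir_post):
    """Differenced normalized burn ratio: NBR_prefire - NBR_postfire,
    where NBR = (NIR - SWIR) / (NIR + SWIR). Higher dNBR = more severe burn."""
    nbr_pre = (nir_pre - swir_pre) / (nir_pre + swir_pre)
    nbr_post = (nir_post - swir_post) / (nir_post + swir_post)
    return nbr_pre - nbr_post


rng = np.random.default_rng(0)
n = 500
# Synthetic stand-ins for the paper's predictors (e.g. greenness,
# elevation, fire weather); the quadratic term makes the response nonlinear.
X = rng.normal(size=(n, 3))
y = 0.5 * X[:, 0] - 0.3 * X[:, 1] ** 2 + 0.2 * X[:, 2] + rng.normal(scale=0.1, size=n)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Super Learner = stacking: each base learner is evaluated out-of-fold
# (cv=5), and a meta-learner combines their out-of-fold predictions.
sl = StackingRegressor(
    estimators=[
        ("enet", ElasticNet(alpha=0.01)),
        ("rf", RandomForestRegressor(n_estimators=100, random_state=0)),
        ("knn", KNeighborsRegressor(n_neighbors=10)),
    ],
    final_estimator=LinearRegression(),
    cv=5,
)
sl.fit(X_tr, y_tr)
ols = LinearRegression().fit(X_tr, y_tr)

mse_sl = mean_squared_error(y_te, sl.predict(X_te))
mse_ols = mean_squared_error(y_te, ols.predict(X_te))
print(f"Super Learner MSE: {mse_sl:.4f}, OLS MSE: {mse_ols:.4f}")
```

On data with this nonlinearity the stacked ensemble should beat ordinary least squares, mirroring the paper's finding that the SL outperformed linear regression across train/test splits.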

Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pascolini-Cambell, M., Lee, C., Stavros, N., Fisher, J.B.: ECOSTRESS reveals pre-fire vegetation controls on burn severity for Southern California wildfires of 2020. Global Ecology and Biogeography 31, 1976–1989 (2021) Jensen et al. [2018] Jensen, D., Reager, J.T., Zajic, B., Rousseau, N., Rodell, M., Hinkley, E.: The sensitivity of US wildfire occurrence to pre-season soil moisture conditions across ecosystems. Environmental Research Letters 13(1) (2018) Fisher et al. 
Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hultquist, C., Chen, G., Zhao, K.: A comparison of gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733 Zhou [2021] Zhou, Z.-H.: Ensemble learning. 
SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. 
Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . 
Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. 
[2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. 
https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . 
International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. 
In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Annual Review of Statistics and Its Application 10 (2023)
Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana. Remote Sensing 8(3) (2016)
Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling and Software 26(12) (2011)
Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis. Mathematical Geology 33 (2001)
Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
Davies and van der Laan [2016] Davies, M.M., van der Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning. International Journal of Biostatistics 12(1) (2016)
Erten et al. [2022] Erten, G.E., Yavuz, M., Deutsch, C.V.: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes. Natural Resources Research 31(1) (2022)
California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/. Accessed 2023-08-20
InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
Hijmans [2023] Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification.
IEEE Transactions on Information Theory 13(1), 21–27 (1967)
Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of Statistics, 1189–1232 (2001)
Chen and Guestrin [2016] Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. 
Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. 
Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
  2. Coop, J., Parks, S., Stevens-Rumann, C., Crausbay, S., Higuera, P., Hurteau, M., Tepley, A., Whitman, E., Assal, T., Collins, B., Davis, K., Dobrowski, S., Falk, D., Fornwalt, P., Fulé, P., Harvey, B., Kane, V., Littlefield, C., Margolis, E., Rodman, K.: Wildfire-driven forest conversion in western North American landscapes. BioScience 70, 659–673 (2020) https://doi.org/10.1093/biosci/biaa061
  3. Heaney, A., Stowell, J.D., Liu, J.C., Basu, R., Marlier, M., Kinney, P.: Impacts of fine particulate matter from wildfire smoke on respiratory and cardiovascular health in California. GeoHealth 6(6) (2022)
  4. Keeley, J.E.: Fire intensity, fire severity and burn severity: a brief review and suggested usage. International Journal of Wildland Fire 18(1), 116 (2009) https://doi.org/10.1071/wf07049
  5. Pascolini-Campbell, M., Lee, C., Stavros, N., Fisher, J.B.: ECOSTRESS reveals pre-fire vegetation controls on burn severity for Southern California wildfires of 2020. Global Ecology and Biogeography 31, 1976–1989 (2021)
  6. Jensen, D., Reager, J.T., Zajic, B., Rousseau, N., Rodell, M., Hinkley, E.: The sensitivity of US wildfire occurrence to pre-season soil moisture conditions across ecosystems. Environmental Research Letters 13(1) (2018)
  7. Fisher, J.B., Lee, B., Purdy, A.J., Halverson, G.H., Dohlen, M.B., Cawse-Nicholson, K., Wang, A., Anderson, R.G., Aragon, B., Arain, M.A., et al.: ECOSTRESS: NASA's next generation mission to measure evapotranspiration from the International Space Station. Water Resources Research 56(4), e2019WR026058 (2020)
  8. Fisher, J.B., Whittaker, R.J., Malhi, Y.: ET come home: potential evapotranspiration in geographical ecology. Global Ecology and Biogeography 20(1), 1–18 (2010) https://doi.org/10.1111/j.1466-8238.2010.00578.x
  9. Huang, Y., Jin, Y., Schwartz, M.W., Thorne, J.H.: Intensified burn severity in California's northern coastal mountains by drier climatic condition. Environmental Research Letters 15(10), 104033 (2020) https://doi.org/10.1088/1748-9326/aba6af
 10. Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the Rim Fire: relative importance of local climate, fire weather, topography, and forest structure. Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001
 11. Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015)
 12. Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019
 13. Hultquist, C., Chen, G., Zhao, K.: A comparison of Gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733
 14. Zhou, Z.-H.: Ensemble learning. Springer (2021)
 15. van Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in Ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x
 16. van der Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C. Berkeley Division of Biostatistics Working Paper Series, Working Paper 222 (2007)
 17. Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data (2023)
 18. Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., Haj, M.E., Fabre, F., Perrin, J.: Regional scale rain-forest height mapping using regression-kriging of spaceborne and airborne LiDAR data: application on French Guiana. Remote Sensing 8(3) (2016)
 19. Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
 20. Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling and Software 26(12) (2011)
 21. Koike, K., Matsuda, S., Gu, B.: Evaluation of interpolation accuracy of neural kriging with application to temperature-distribution analysis. Mathematical Geology 33 (2001)
 22. Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
 23. Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
 24. Davies, M.M., van der Laan, M.J.: Optimal spatial prediction using ensemble machine learning. International Journal of Biostatistics 12(1) (2016)
 25. Erdogan Erten, G., Yavuz, M., Deutsch, C.V.: Combination of machine learning and kriging for spatial estimation of geological attributes. Natural Resources Research 31(1) (2022)
 26. California Department of Forestry and Fire Protection: CAL FIRE incidents (2023). https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
 27. InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
 28. Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23, Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
 29. Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
 30. Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
 31. Hoerl, A.E., Kennard, R.W.: Ridge regression: biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
 32. Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
 33. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
 34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
 35. Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of Statistics, 1189–1232 (2001)
 36. Chen, T., Guestrin, C.: XGBoost: a scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
 37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
 38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
 39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
 40. Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
 41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
 42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: fast Gaussian process computation using Vecchia's approximation. R package, Comprehensive R Archive Network (2021)
 43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
 44. Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
 45. Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
 46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
 47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
 48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Heaney, A., Stowell, J.D., Liu, J.C., Basu, R., Marlier, M., Kinney, P.: Impacts of fine particulate matter from wildfire smoke on respiratory and cardiovascular health in california. GeoHealth 6(6) (2022) Keeley [2009] Keeley, J.E.: Fire intensity, fire severity and burn severity: A brief review and suggested usage. International Journal of Wildland Fire 18(1), 116 (2009) https://doi.org/10.1071/wf07049 Pascolini-Cambell et al. [2021] Pascolini-Cambell, M., Lee, C., Stavros, N., Fisher, J.B.: ECOSTRESS reveals pre-fire vegetation controls on burn severity for Southern California wildfires of 2020. Global Ecology and Biogeography 31, 1976–1989 (2021) Jensen et al. [2018] Jensen, D., Reager, J.T., Zajic, B., Rousseau, N., Rodell, M., Hinkley, E.: The sensitivity of US wildfire occurrence to pre-season soil moisture conditions across ecosystems. Environmental Research Letters 13(1) (2018) Fisher et al. [2020] Fisher, J.B., Lee, B., Purdy, A.J., Halverson, G.H., Dohlen, M.B., Cawse-Nicholson, K., Wang, A., Anderson, R.G., Aragon, B., Arain, M.A., et al.: Ecostress: Nasa’s next generation mission to measure evapotranspiration from the international space station. Water Resources Research 56(4), 2019–026058 (2020) Fisher et al. [2010] Fisher, J.B., Whittaker, R.J., Malhi, Y.: Et come home: Potential evapotranspiration in geographical ecology. Global Ecology and Biogeography 20(1), 1–18 (2010) https://doi.org/10.1111/j.1466-8238.2010.00578.x Huang et al. [2020] Huang, Y., Jin, Y., Schwartz, M.W., Thorne, J.H.: Intensified burn severity in california’s northern coastal mountains by drier climatic condition. Environmental Research Letters 15(10), 104033 (2020) https://doi.org/10.1088/1748-9326/aba6af Kane et al. 
[2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, J.B., Lee, B., Purdy, A.J., Halverson, G.H., Dohlen, M.B., Cawse-Nicholson, K., Wang, A., Anderson, R.G., Aragon, B., Arain, M.A., et al.: Ecostress: Nasa’s next generation mission to measure evapotranspiration from the international space station. Water Resources Research 56(4), 2019–026058 (2020) Fisher et al. 
[2010] Fisher, J.B., Whittaker, R.J., Malhi, Y.: Et come home: Potential evapotranspiration in geographical ecology. Global Ecology and Biogeography 20(1), 1–18 (2010) https://doi.org/10.1111/j.1466-8238.2010.00578.x Huang et al. [2020] Huang, Y., Jin, Y., Schwartz, M.W., Thorne, J.H.: Intensified burn severity in california’s northern coastal mountains by drier climatic condition. Environmental Research Letters 15(10), 104033 (2020) https://doi.org/10.1088/1748-9326/aba6af Kane et al. [2015] Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the rim fire: Relative importance of local climate, fire weather, topography, and forest structure. Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001 Hoffman et al. [2015] Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015) Jain et al. [2020] Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019 Hultquist et al. [2014] Hultquist, C., Chen, G., Zhao, K.: A comparison of gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733 Zhou [2021] Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. 
[2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. 
Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 
785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, J.B., Whittaker, R.J., Malhi, Y.: Et come home: Potential evapotranspiration in geographical ecology. Global Ecology and Biogeography 20(1), 1–18 (2010) https://doi.org/10.1111/j.1466-8238.2010.00578.x Huang et al. [2020] Huang, Y., Jin, Y., Schwartz, M.W., Thorne, J.H.: Intensified burn severity in california’s northern coastal mountains by drier climatic condition. Environmental Research Letters 15(10), 104033 (2020) https://doi.org/10.1088/1748-9326/aba6af Kane et al. [2015] Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the rim fire: Relative importance of local climate, fire weather, topography, and forest structure. Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001 Hoffman et al. [2015] Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015) Jain et al. [2020] Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019 Hultquist et al. [2014] Hultquist, C., Chen, G., Zhao, K.: A comparison of gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733 Zhou [2021] Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. 
[2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. 
Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Huang, Y., Jin, Y., Schwartz, M.W., Thorne, J.H.: Intensified burn severity in california’s northern coastal mountains by drier climatic condition. Environmental Research Letters 15(10), 104033 (2020) https://doi.org/10.1088/1748-9326/aba6af Kane et al. [2015] Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the rim fire: Relative importance of local climate, fire weather, topography, and forest structure. Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001 Hoffman et al. [2015] Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015) Jain et al. [2020] Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019 Hultquist et al. [2014] Hultquist, C., Chen, G., Zhao, K.: A comparison of gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. 
Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733 Zhou [2021] Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. 
[2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the rim fire: Relative importance of local climate, fire weather, topography, and forest structure. Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001 Hoffman et al. [2015] Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015) Jain et al. [2020] Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019 Hultquist et al. [2014] Hultquist, C., Chen, G., Zhao, K.: A comparison of gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733 Zhou [2021] Zhou, Z.-H.: Ensemble learning. 
SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. 
Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
Davies and van der Laan [2016] Davies, M.M., van der Laan, M.J.: Optimal spatial prediction using ensemble machine learning. International Journal of Biostatistics 12(1) (2016)
Erdogan Erten et al. [2022] Erdogan Erten, G., Deutsch, C.V., Yavuz, M.: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes. Natural Resources Research 31(1) (2022)
California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL FIRE Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
Hijmans [2023] Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23, Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
[2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. 
https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . 
International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. 
In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. 
Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. 
Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. 
Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 
785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. 
[2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. 
Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. 
https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. 
Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988). https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on competition on spatial statistics for large datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 
20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  3. Heaney, A., Stowell, J.D., Liu, J.C., Basu, R., Marlier, M., Kinney, P.: Impacts of fine particulate matter from wildfire smoke on respiratory and cardiovascular health in california. GeoHealth 6(6) (2022) Keeley [2009] Keeley, J.E.: Fire intensity, fire severity and burn severity: A brief review and suggested usage. International Journal of Wildland Fire 18(1), 116 (2009) https://doi.org/10.1071/wf07049 Pascolini-Cambell et al. [2021] Pascolini-Cambell, M., Lee, C., Stavros, N., Fisher, J.B.: ECOSTRESS reveals pre-fire vegetation controls on burn severity for Southern California wildfires of 2020. Global Ecology and Biogeography 31, 1976–1989 (2021) Jensen et al. [2018] Jensen, D., Reager, J.T., Zajic, B., Rousseau, N., Rodell, M., Hinkley, E.: The sensitivity of US wildfire occurrence to pre-season soil moisture conditions across ecosystems. Environmental Research Letters 13(1) (2018) Fisher et al. [2020] Fisher, J.B., Lee, B., Purdy, A.J., Halverson, G.H., Dohlen, M.B., Cawse-Nicholson, K., Wang, A., Anderson, R.G., Aragon, B., Arain, M.A., et al.: Ecostress: Nasa’s next generation mission to measure evapotranspiration from the international space station. Water Resources Research 56(4), 2019–026058 (2020) Fisher et al. [2010] Fisher, J.B., Whittaker, R.J., Malhi, Y.: Et come home: Potential evapotranspiration in geographical ecology. Global Ecology and Biogeography 20(1), 1–18 (2010) https://doi.org/10.1111/j.1466-8238.2010.00578.x Huang et al. [2020] Huang, Y., Jin, Y., Schwartz, M.W., Thorne, J.H.: Intensified burn severity in california’s northern coastal mountains by drier climatic condition. Environmental Research Letters 15(10), 104033 (2020) https://doi.org/10.1088/1748-9326/aba6af Kane et al. [2015] Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the rim fire: Relative importance of local climate, fire weather, topography, and forest structure. 
Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001 Hoffman et al. [2015] Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015) Jain et al. [2020] Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019 Hultquist et al. [2014] Hultquist, C., Chen, G., Zhao, K.: A comparison of gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733 Zhou [2021] Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. 
[2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Keeley, J.E.: Fire intensity, fire severity and burn severity: A brief review and suggested usage. International Journal of Wildland Fire 18(1), 116 (2009) https://doi.org/10.1071/wf07049 Pascolini-Cambell et al. 
[2021] Pascolini-Cambell, M., Lee, C., Stavros, N., Fisher, J.B.: ECOSTRESS reveals pre-fire vegetation controls on burn severity for Southern California wildfires of 2020. Global Ecology and Biogeography 31, 1976–1989 (2021) Jensen et al. [2018] Jensen, D., Reager, J.T., Zajic, B., Rousseau, N., Rodell, M., Hinkley, E.: The sensitivity of US wildfire occurrence to pre-season soil moisture conditions across ecosystems. Environmental Research Letters 13(1) (2018) Fisher et al. [2020] Fisher, J.B., Lee, B., Purdy, A.J., Halverson, G.H., Dohlen, M.B., Cawse-Nicholson, K., Wang, A., Anderson, R.G., Aragon, B., Arain, M.A., et al.: Ecostress: Nasa’s next generation mission to measure evapotranspiration from the international space station. Water Resources Research 56(4), 2019–026058 (2020) Fisher et al. [2010] Fisher, J.B., Whittaker, R.J., Malhi, Y.: Et come home: Potential evapotranspiration in geographical ecology. Global Ecology and Biogeography 20(1), 1–18 (2010) https://doi.org/10.1111/j.1466-8238.2010.00578.x Huang et al. [2020] Huang, Y., Jin, Y., Schwartz, M.W., Thorne, J.H.: Intensified burn severity in california’s northern coastal mountains by drier climatic condition. Environmental Research Letters 15(10), 104033 (2020) https://doi.org/10.1088/1748-9326/aba6af Kane et al. [2015] Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the rim fire: Relative importance of local climate, fire weather, topography, and forest structure. Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001 Hoffman et al. [2015] Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015) Jain et al. 
[2020] Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019 Hultquist et al. [2014] Hultquist, C., Chen, G., Zhao, K.: A comparison of gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733 Zhou [2021] Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. 
[2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pascolini-Cambell, M., Lee, C., Stavros, N., Fisher, J.B.: ECOSTRESS reveals pre-fire vegetation controls on burn severity for Southern California wildfires of 2020. Global Ecology and Biogeography 31, 1976–1989 (2021) Jensen et al. [2018] Jensen, D., Reager, J.T., Zajic, B., Rousseau, N., Rodell, M., Hinkley, E.: The sensitivity of US wildfire occurrence to pre-season soil moisture conditions across ecosystems. Environmental Research Letters 13(1) (2018) Fisher et al. 
[2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 
785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. 
[2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. 
Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. 
https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American Statistical Association 58(302), 415–434 (1963)
Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
Friedman [2001] Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
Chen and Guestrin [2016] Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
Breiman [1996] Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
Breiman [2001] Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
Rosenblatt [1958] Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
Guinness et al. [2021] Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
Okoli [2023] Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Erten et al. [2022] Erten, G.E., Yavuz, M., Deutsch, C.V.: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes. Natural Resources Research 31(1) (2022)
California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL FIRE Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
Hijmans [2023] Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 
20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 
20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  4. Keeley, J.E.: Fire intensity, fire severity and burn severity: A brief review and suggested usage. International Journal of Wildland Fire 18(1), 116 (2009) https://doi.org/10.1071/wf07049
  5. Pascolini-Campbell, M., Lee, C., Stavros, N., Fisher, J.B.: ECOSTRESS reveals pre-fire vegetation controls on burn severity for Southern California wildfires of 2020. Global Ecology and Biogeography 31, 1976–1989 (2021)
  6. Jensen, D., Reager, J.T., Zajic, B., Rousseau, N., Rodell, M., Hinkley, E.: The sensitivity of US wildfire occurrence to pre-season soil moisture conditions across ecosystems. Environmental Research Letters 13(1) (2018)
  7. Fisher, J.B., Lee, B., Purdy, A.J., Halverson, G.H., Dohlen, M.B., Cawse-Nicholson, K., Wang, A., Anderson, R.G., Aragon, B., Arain, M.A., et al.: ECOSTRESS: NASA's next generation mission to measure evapotranspiration from the International Space Station. Water Resources Research 56(4), e2019WR026058 (2020)
  8. Fisher, J.B., Whittaker, R.J., Malhi, Y.: ET come home: Potential evapotranspiration in geographical ecology. Global Ecology and Biogeography 20(1), 1–18 (2010) https://doi.org/10.1111/j.1466-8238.2010.00578.x
  9. Huang, Y., Jin, Y., Schwartz, M.W., Thorne, J.H.: Intensified burn severity in California's northern coastal mountains by drier climatic condition. Environmental Research Letters 15(10), 104033 (2020) https://doi.org/10.1088/1748-9326/aba6af
 10. Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the Rim fire: Relative importance of local climate, fire weather, topography, and forest structure. Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001
 11. Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015)
 12. Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019
 13. Hultquist, C., Chen, G., Zhao, K.: A comparison of Gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733
 14. Zhou, Z.-H.: Ensemble Learning. Springer (2021)
 15. van Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in Ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x
 16. van der Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C. Berkeley Division of Biostatistics Working Paper Series, Working Paper 222 (2007)
 17. Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Annual Review of Statistics and Its Application 10 (2023)
 18. Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., Haj, M.E., Fabre, F., Perrin, J.: Regional scale rain-forest height mapping using regression-kriging of spaceborne and airborne LiDAR data: Application on French Guiana. Remote Sensing 8(3) (2016)
 19. Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
 20. Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling & Software 26(12) (2011)
 21. Koike, K., Matsuda, S., Gu, B.: Evaluation of interpolation accuracy of neural kriging with application to temperature-distribution analysis. Mathematical Geology 33 (2001)
 22. Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
 23. Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
 24. Davies, M.M., van der Laan, M.J.: Optimal spatial prediction using ensemble machine learning. International Journal of Biostatistics 12(1) (2016)
 25. Erten, G.E., Yavuz, M., Deutsch, C.V.: Combination of machine learning and kriging for spatial estimation of geological attributes. Natural Resources Research 31(1) (2022)
 26. California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
 27. InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
 28. Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
 29. Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
 30. Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
 31. Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
 32. Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
 33. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
 34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
 35. Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
 36. Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
 37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
 38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
 39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
 40. Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
 41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
 42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
 43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
 44. Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
 45. Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
 46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
 47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
 48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Jensen, D., Reager, J.T., Zajic, B., Rousseau, N., Rodell, M., Hinkley, E.: The sensitivity of US wildfire occurrence to pre-season soil moisture conditions across ecosystems. Environmental Research Letters 13(1) (2018) Fisher et al. [2020] Fisher, J.B., Lee, B., Purdy, A.J., Halverson, G.H., Dohlen, M.B., Cawse-Nicholson, K., Wang, A., Anderson, R.G., Aragon, B., Arain, M.A., et al.: Ecostress: Nasa’s next generation mission to measure evapotranspiration from the international space station. Water Resources Research 56(4), 2019–026058 (2020) Fisher et al. [2010] Fisher, J.B., Whittaker, R.J., Malhi, Y.: Et come home: Potential evapotranspiration in geographical ecology. Global Ecology and Biogeography 20(1), 1–18 (2010) https://doi.org/10.1111/j.1466-8238.2010.00578.x Huang et al. [2020] Huang, Y., Jin, Y., Schwartz, M.W., Thorne, J.H.: Intensified burn severity in california’s northern coastal mountains by drier climatic condition. Environmental Research Letters 15(10), 104033 (2020) https://doi.org/10.1088/1748-9326/aba6af Kane et al. [2015] Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the rim fire: Relative importance of local climate, fire weather, topography, and forest structure. 
Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001 Hoffman et al. [2015] Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015) Jain et al. [2020] Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019 Hultquist et al. [2014] Hultquist, C., Chen, G., Zhao, K.: A comparison of gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733 Zhou [2021] Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. 
[2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, J.B., Lee, B., Purdy, A.J., Halverson, G.H., Dohlen, M.B., Cawse-Nicholson, K., Wang, A., Anderson, R.G., Aragon, B., Arain, M.A., et al.: Ecostress: Nasa’s next generation mission to measure evapotranspiration from the international space station. Water Resources Research 56(4), 2019–026058 (2020) Fisher et al. 
[2010] Fisher, J.B., Whittaker, R.J., Malhi, Y.: Et come home: Potential evapotranspiration in geographical ecology. Global Ecology and Biogeography 20(1), 1–18 (2010) https://doi.org/10.1111/j.1466-8238.2010.00578.x Huang et al. [2020] Huang, Y., Jin, Y., Schwartz, M.W., Thorne, J.H.: Intensified burn severity in california’s northern coastal mountains by drier climatic condition. Environmental Research Letters 15(10), 104033 (2020) https://doi.org/10.1088/1748-9326/aba6af Kane et al. [2015] Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the rim fire: Relative importance of local climate, fire weather, topography, and forest structure. Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001 Hoffman et al. [2015] Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015) Jain et al. [2020] Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019 Hultquist et al. [2014] Hultquist, C., Chen, G., Zhao, K.: A comparison of gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733 Zhou [2021] Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. 
[2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. 
Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 
785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, J.B., Whittaker, R.J., Malhi, Y.: Et come home: Potential evapotranspiration in geographical ecology. Global Ecology and Biogeography 20(1), 1–18 (2010) https://doi.org/10.1111/j.1466-8238.2010.00578.x Huang et al. [2020] Huang, Y., Jin, Y., Schwartz, M.W., Thorne, J.H.: Intensified burn severity in california’s northern coastal mountains by drier climatic condition. Environmental Research Letters 15(10), 104033 (2020) https://doi.org/10.1088/1748-9326/aba6af Kane et al. [2015] Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the rim fire: Relative importance of local climate, fire weather, topography, and forest structure. Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001 Hoffman et al. [2015] Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015) Jain et al. [2020] Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019 Hultquist et al. [2014] Hultquist, C., Chen, G., Zhao, K.: A comparison of gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733 Zhou [2021] Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. 
[2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. 
Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. 
Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Huang, Y., Jin, Y., Schwartz, M.W., Thorne, J.H.: Intensified burn severity in California's northern coastal mountains by drier climatic condition. Environmental Research Letters 15(10), 104033 (2020) https://doi.org/10.1088/1748-9326/aba6af
Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the Rim Fire: Relative importance of local climate, fire weather, topography, and forest structure. Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001
Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015)
Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019
Hultquist, C., Chen, G., Zhao, K.: A comparison of Gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733
Zhou, Z.-H.: Ensemble Learning. Springer (2021)
van Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in Ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x
van der Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C. Berkeley Division of Biostatistics Working Paper Series, Working Paper 222 (2007)
Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data (2023)
Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., Haj, M.E., Fabre, F., Perrin, J.: Regional scale rain-forest height mapping using regression-kriging of spaceborne and airborne LiDAR data: Application on French Guiana. Remote Sensing 8(3) (2016)
Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling & Software 26(12) (2011)
Koike, K., Matsuda, S., Gu, B.: Evaluation of interpolation accuracy of neural kriging with application to temperature-distribution analysis. Mathematical Geology 33 (2001)
Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
Davies, M.M., van der Laan, M.J.: Optimal spatial prediction using ensemble machine learning. International Journal of Biostatistics 12(1) (2016)
Erdogan Erten, G., Yavuz, M., Deutsch, C.V.: Combination of machine learning and kriging for spatial estimation of geological attributes. Natural Resources Research 31(1) (2022)
California Department of Forestry and Fire Protection: CAL FIRE Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23, Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. 
[2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. 
Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. 
https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. 
Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. 
Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 
785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
  5. Pascolini-Cambell, M., Lee, C., Stavros, N., Fisher, J.B.: ECOSTRESS reveals pre-fire vegetation controls on burn severity for Southern California wildfires of 2020. Global Ecology and Biogeography 31, 1976–1989 (2021) Jensen et al. [2018] Jensen, D., Reager, J.T., Zajic, B., Rousseau, N., Rodell, M., Hinkley, E.: The sensitivity of US wildfire occurrence to pre-season soil moisture conditions across ecosystems. Environmental Research Letters 13(1) (2018) Fisher et al. [2020] Fisher, J.B., Lee, B., Purdy, A.J., Halverson, G.H., Dohlen, M.B., Cawse-Nicholson, K., Wang, A., Anderson, R.G., Aragon, B., Arain, M.A., et al.: Ecostress: Nasa’s next generation mission to measure evapotranspiration from the international space station. Water Resources Research 56(4), 2019–026058 (2020) Fisher et al. [2010] Fisher, J.B., Whittaker, R.J., Malhi, Y.: Et come home: Potential evapotranspiration in geographical ecology. Global Ecology and Biogeography 20(1), 1–18 (2010) https://doi.org/10.1111/j.1466-8238.2010.00578.x Huang et al. [2020] Huang, Y., Jin, Y., Schwartz, M.W., Thorne, J.H.: Intensified burn severity in california’s northern coastal mountains by drier climatic condition. Environmental Research Letters 15(10), 104033 (2020) https://doi.org/10.1088/1748-9326/aba6af Kane et al. [2015] Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the rim fire: Relative importance of local climate, fire weather, topography, and forest structure. Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001 Hoffman et al. [2015] Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015) Jain et al. 
[2020] Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019 Hultquist et al. [2014] Hultquist, C., Chen, G., Zhao, K.: A comparison of gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733 Zhou [2021] Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. 
[2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Jensen, D., Reager, J.T., Zajic, B., Rousseau, N., Rodell, M., Hinkley, E.: The sensitivity of US wildfire occurrence to pre-season soil moisture conditions across ecosystems. Environmental Research Letters 13(1) (2018) Fisher et al. [2020] Fisher, J.B., Lee, B., Purdy, A.J., Halverson, G.H., Dohlen, M.B., Cawse-Nicholson, K., Wang, A., Anderson, R.G., Aragon, B., Arain, M.A., et al.: Ecostress: Nasa’s next generation mission to measure evapotranspiration from the international space station. Water Resources Research 56(4), 2019–026058 (2020) Fisher et al. 
[2010] Fisher, J.B., Whittaker, R.J., Malhi, Y.: Et come home: Potential evapotranspiration in geographical ecology. Global Ecology and Biogeography 20(1), 1–18 (2010) https://doi.org/10.1111/j.1466-8238.2010.00578.x Huang et al. [2020] Huang, Y., Jin, Y., Schwartz, M.W., Thorne, J.H.: Intensified burn severity in california’s northern coastal mountains by drier climatic condition. Environmental Research Letters 15(10), 104033 (2020) https://doi.org/10.1088/1748-9326/aba6af Kane et al. [2015] Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the rim fire: Relative importance of local climate, fire weather, topography, and forest structure. Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001 Hoffman et al. [2015] Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015) Jain et al. [2020] Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019 Hultquist et al. [2014] Hultquist, C., Chen, G., Zhao, K.: A comparison of gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733 Zhou [2021] Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. 
[2007] van der Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C. Berkeley Division of Biostatistics Working Paper Series, Working Paper 222 (2007)
Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Annual Review of Statistics and Its Application 10 (2023)
Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., El Haj, M., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana. Remote Sensing 8(3) (2016)
Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling & Software 26(12) (2011)
Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis. Mathematical Geology 33 (2001)
Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
Davies and van der Laan [2016] Davies, M.M., van der Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning. International Journal of Biostatistics 12(1) (2016)
Erdogan Erten et al. [2022] Erdogan Erten, G., Deutsch, C.V., Yavuz, M.: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes. Natural Resources Research 31(1) (2022)
California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
Hijmans [2023] Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23, Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
Friedman [2001] Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
Chen and Guestrin [2016] Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp.
785–794 (2016)
Breiman [1996] Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
Breiman [2001] Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
Rosenblatt [1958] Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
Guinness et al. [2021] Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia’s approximation. The Comprehensive R Archive Network (2021)
Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
Okoli [2023] Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Statistics and Computing 31(26) (2021)
Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets.
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. 
[2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. 
https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . 
Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. 
Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . 
Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. 
Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). 
https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . 
Environmental Modeling and Software 26(12) (2011)
Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of interpolation accuracy of neural kriging with application to temperature-distribution analysis. Mathematical Geology 33 (2001)
Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
Davies and van der Laan [2016] Davies, M.M., van der Laan, M.J.: Optimal spatial prediction using ensemble machine learning. International Journal of Biostatistics 12(1) (2016)
Erdogan Erten et al. [2022] Erdogan Erten, G., Yavuz, M., Deutsch, C.V.: Combination of machine learning and kriging for spatial estimation of geological attributes. Natural Resources Research 31(1) (2022)
California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL FIRE Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
Hijmans [2023] Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
Friedman [2001] Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
Chen and Guestrin [2016] Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
Breiman [1996] Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
Breiman [2001] Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
Rosenblatt [1958] Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
Guinness et al. [2021] Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
Okoli [2023] Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. 
Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. 
Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. 
[2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  6. Jensen, D., Reager, J.T., Zajic, B., Rousseau, N., Rodell, M., Hinkley, E.: The sensitivity of US wildfire occurrence to pre-season soil moisture conditions across ecosystems. Environmental Research Letters 13(1) (2018)
  7. Fisher, J.B., Lee, B., Purdy, A.J., Halverson, G.H., Dohlen, M.B., Cawse-Nicholson, K., Wang, A., Anderson, R.G., Aragon, B., Arain, M.A., et al.: ECOSTRESS: NASA's next generation mission to measure evapotranspiration from the International Space Station. Water Resources Research 56(4), e2019WR026058 (2020)
  8. Fisher, J.B., Whittaker, R.J., Malhi, Y.: ET come home: Potential evapotranspiration in geographical ecology. Global Ecology and Biogeography 20(1), 1–18 (2010) https://doi.org/10.1111/j.1466-8238.2010.00578.x
  9. Huang, Y., Jin, Y., Schwartz, M.W., Thorne, J.H.: Intensified burn severity in California's northern coastal mountains by drier climatic condition. Environmental Research Letters 15(10), 104033 (2020) https://doi.org/10.1088/1748-9326/aba6af
  10. Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the Rim Fire: Relative importance of local climate, fire weather, topography, and forest structure. Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001
  11. Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015)
  12. Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019
  13. Hultquist, C., Chen, G., Zhao, K.: A comparison of Gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733
  14. Zhou, Z.-H.: Ensemble Learning. Springer (2021)
  15. van Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in Ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x
  16. van der Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C. Berkeley Division of Biostatistics Working Paper Series, Working Paper 222 (2007)
  17. Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data (2023)
  18. Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., El Haj, M., Fabre, F., Perrin, J.: Regional scale rain-forest height mapping using regression-kriging of spaceborne and airborne LiDAR data: Application on French Guiana. Remote Sensing 8(3) (2016)
  19. Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
  20. Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling and Software 26(12) (2011)
  21. Koike, K., Matsuda, S., Gu, B.: Evaluation of interpolation accuracy of neural kriging with application to temperature-distribution analysis. Mathematical Geology 33 (2001)
  22. Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
  23. Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
  24. Davies, M.M., van der Laan, M.J.: Optimal spatial prediction using ensemble machine learning. International Journal of Biostatistics 12(1) (2016)
  25. Erdogan Erten, G., Yavuz, M., Deutsch, C.V.: Combination of machine learning and kriging for spatial estimation of geological attributes. Natural Resources Research 31(1) (2022)
  26. California Department of Forestry and Fire Protection: CAL FIRE Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
  27. InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
  28. Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23, The Comprehensive R Archive Network (2023). https://cran.r-project.org/web/packages/raster/index.html
  29. Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
  30. Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
  31. Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
  32. Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
  33. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
  34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
  35. Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics, 1189–1232 (2001)
  36. Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
  37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
  38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
  39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
  40. Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
  41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
  42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
  43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
  44. Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
  45. Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
  46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
  47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
  48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
[2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. 
Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the rim fire: Relative importance of local climate, fire weather, topography, and forest structure. Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001 Hoffman et al. [2015] Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015) Jain et al. [2020] Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019 Hultquist et al. [2014] Hultquist, C., Chen, G., Zhao, K.: A comparison of gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733 Zhou [2021] Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. 
[2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. 
Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015) Jain et al. [2020] Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019 Hultquist et al. [2014] Hultquist, C., Chen, G., Zhao, K.: A comparison of gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733 Zhou [2021] Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. 
U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. 
Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 
785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019 Hultquist et al. [2014] Hultquist, C., Chen, G., Zhao, K.: A comparison of gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733 Zhou [2021] Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. 
[2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html
Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
Friedman [2001] Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
Chen and Guestrin [2016] Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
Breiman [1996] Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
Breiman [2001] Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
Rosenblatt [1958] Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988). https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
Guinness et al. [2021] Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
Okoli [2023] Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on competition on spatial statistics for large datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Hultquist et al. [2014] Hultquist, C., Chen, G., Zhao, K.: A comparison of Gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014). https://doi.org/10.1080/2150704x.2014.963733
Zhou [2021] Zhou, Z.-H.: Ensemble learning. Springer (2021)
van Breugel et al. [2015] van Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in Ethiopia. Ecosystems 19(2), 369–386 (2015). https://doi.org/10.1007/s10021-015-9938-x
van der Laan et al. [2007] van der Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C. Berkeley Division of Biostatistics Working Paper Series, Working Paper 222 (2007)
Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data (2023)
Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., Haj, M.E., Fabre, F., Perrin, J.: Regional scale rain-forest height mapping using regression-kriging of spaceborne and airborne LiDAR data: Application on French Guiana. Remote Sensing 8(3) (2016)
Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling and Software 26(12) (2011)
Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of interpolation accuracy of neural kriging with application to temperature-distribution analysis. Mathematical Geology 33 (2001)
Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021). https://doi.org/10.1080/01621459.2021.1950003
Davies and van der Laan [2016] Davies, M.M., van der Laan, M.J.: Optimal spatial prediction using ensemble machine learning. International Journal of Biostatistics 12(1) (2016)
Erten et al. [2022] Erten, G.E., Yavuz, M., Deutsch, C.V.: Combination of machine learning and kriging for spatial estimation of geological attributes. Natural Resources Research 31(1) (2022)
California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL FIRE Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
Hijmans [2023] Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. 
Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 
785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. 
[2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. 
Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. 
https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019)
Okoli [2023] Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Statistics and Computing 31(26) (2021)
Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Davies and van der Laan [2016] Davies, M.M., van der Laan, M.J.: Optimal spatial prediction using ensemble machine learning. International Journal of Biostatistics 12(1) (2016)
Erten et al. [2022] Erten, G.E., Yavuz, M., Deutsch, C.V.: Combination of machine learning and kriging for spatial estimation of geological attributes. Natural Resources Research 31(1) (2022)
California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL FIRE Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
Hijmans [2023] Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23, Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
Friedman [2001] Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics, 1189–1232 (2001)
Chen and Guestrin [2016] Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
Breiman [1996] Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
Breiman [2001] Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
Rosenblatt [1958] Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
Guinness et al. [2021] Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia’s approximation. R package, Comprehensive R Archive Network (CRAN) (2021)
Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes.
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. 
Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 
20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 
20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  7. Fisher, J.B., Lee, B., Purdy, A.J., Halverson, G.H., Dohlen, M.B., Cawse-Nicholson, K., Wang, A., Anderson, R.G., Aragon, B., Arain, M.A., et al.: ECOSTRESS: NASA's next generation mission to measure evapotranspiration from the International Space Station. Water Resources Research 56(4), e2019WR026058 (2020)
  8. Fisher, J.B., Whittaker, R.J., Malhi, Y.: ET come home: Potential evapotranspiration in geographical ecology. Global Ecology and Biogeography 20(1), 1–18 (2010). https://doi.org/10.1111/j.1466-8238.2010.00578.x
  9. Huang, Y., Jin, Y., Schwartz, M.W., Thorne, J.H.: Intensified burn severity in California's northern coastal mountains by drier climatic condition. Environmental Research Letters 15(10), 104033 (2020). https://doi.org/10.1088/1748-9326/aba6af
  10. Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the Rim Fire: Relative importance of local climate, fire weather, topography, and forest structure. Forest Ecology and Management 358, 62–79 (2015). https://doi.org/10.1016/j.foreco.2015.09.001
  11. Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015)
  12. Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020). https://doi.org/10.1139/er-2020-0019
  13. Hultquist, C., Chen, G., Zhao, K.: A comparison of Gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014). https://doi.org/10.1080/2150704x.2014.963733
  14. Zhou, Z.-H.: Ensemble learning. Springer (2021)
  15. van Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in Ethiopia. Ecosystems 19(2), 369–386 (2015). https://doi.org/10.1007/s10021-015-9938-x
  16. van der Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C. Berkeley Division of Biostatistics Working Paper Series, Working Paper 222 (2007)
  17. Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data (2023)
  18. Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., Haj, M.E., Fabre, F., Perrin, J.: Regional scale rain-forest height mapping using regression-kriging of spaceborne and airborne LiDAR data: Application on French Guiana. Remote Sensing 8(3) (2016)
  19. Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
  20. Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling and Software 26(12) (2011)
  21. Koike, K., Matsuda, S., Gu, B.: Evaluation of interpolation accuracy of neural kriging with application to temperature-distribution analysis. Mathematical Geology 33 (2001)
  22. Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
  23. Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021). https://doi.org/10.1080/01621459.2021.1950003
  24. Davies, M.M., van der Laan, M.J.: Optimal spatial prediction using ensemble machine learning. International Journal of Biostatistics 12(1) (2016)
  25. Erdogan Erten, G., Deutsch, C.V., Yavuz, M.: Combination of machine learning and kriging for spatial estimation of geological attributes. Natural Resources Research 31(1) (2022)
  26. California Department of Forestry and Fire Protection: CAL FIRE Incidents (2023). https://www.fire.ca.gov/incidents/. Accessed 2023-08-20
  27. InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
  28. Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
  29. Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
  30. Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
  31. Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
  32. Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
  33. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
  34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
  35. Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics, 1189–1232 (2001)
  36. Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
  37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
  38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
  39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
  40. Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
  41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988). https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
  42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. Comprehensive R Archive Network (2021)
  43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
  44. Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
  45. Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE) (2023).
  R package version 0.1.0. https://CRAN.R-project.org/package=ale
  46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
  47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
  48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hultquist, C., Chen, G., Zhao, K.: A comparison of gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733 Zhou [2021] Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. 
[2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. 
Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. 
https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. 
Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. 
[2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. 
Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable's Importance by Studying an Entire Class of Prediction Models Simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
Okoli [2023] Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Annual Review of Statistics and Its Application 10 (2023)
Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., El Haj, M., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana. Remote Sensing 8(3) (2016)
Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling & Software 26(12) (2011)
Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis. Mathematical Geology 33 (2001)
Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021). https://doi.org/10.1080/01621459.2021.1950003
Davies and van der Laan [2016] Davies, M.M., van der Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning. The International Journal of Biostatistics 12(1) (2016)
Erten et al. [2022] Erten, G.E., Deutsch, C.V., Yavuz, M.: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes. Natural Resources Research 31(1) (2022)
California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL FIRE Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
Hijmans [2023] Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
Friedman [2001] Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
Chen and Guestrin [2016] Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
Breiman [1996] Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
Breiman [2001] Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
Rosenblatt [1958] Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988). https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
Guinness et al. [2021] Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine Learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 
785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. 
Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. 
[2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Friedman, J.H.: Greedy function approximation: a gradient boosting machine. 
Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. 
Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
8. Fisher, J.B., Whittaker, R.J., Malhi, Y.: ET come home: Potential evapotranspiration in geographical ecology. Global Ecology and Biogeography 20(1), 1–18 (2010) https://doi.org/10.1111/j.1466-8238.2010.00578.x
9. Huang, Y., Jin, Y., Schwartz, M.W., Thorne, J.H.: Intensified burn severity in California's northern coastal mountains by drier climatic condition. Environmental Research Letters 15(10), 104033 (2020) https://doi.org/10.1088/1748-9326/aba6af
10. Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the Rim Fire: Relative importance of local climate, fire weather, topography, and forest structure. Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001
11. Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015)
12. Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019
13. Hultquist, C., Chen, G., Zhao, K.: A comparison of Gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733
14. Zhou, Z.-H.: Ensemble Learning. Springer (2021)
15. van Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in Ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x
16. van der Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C. Berkeley Division of Biostatistics Working Paper Series, Working Paper 222 (2007)
17. Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data (2023)
18. Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., Haj, M.E., Fabre, F., Perrin, J.: Regional scale rain-forest height mapping using regression-kriging of spaceborne and airborne LiDAR data: Application on French Guiana. Remote Sensing 8(3) (2016)
19. Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
20. Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling and Software 26(12) (2011)
21. Koike, K., Matsuda, S., Gu, B.: Evaluation of interpolation accuracy of neural kriging with application to temperature-distribution analysis. Mathematical Geology 33 (2001)
22. Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
23. Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
24. Davies, M.M., van der Laan, M.J.: Optimal spatial prediction using ensemble machine learning. International Journal of Biostatistics 12(1) (2016)
25. Erdogan Erten, G., Deutsch, C.V., Yavuz, M.: Combination of machine learning and kriging for spatial estimation of geological attributes. Natural Resources Research 31(1) (2022)
26. California Department of Forestry and Fire Protection: CAL FIRE Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
27. InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
28. Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23, CRAN (2023). https://cran.r-project.org/web/packages/raster/index.html
29. Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
30. Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
31. Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
32. Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
33. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
35. Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
36. Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
40. Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
44. Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
45. Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Huang, Y., Jin, Y., Schwartz, M.W., Thorne, J.H.: Intensified burn severity in california’s northern coastal mountains by drier climatic condition. Environmental Research Letters 15(10), 104033 (2020) https://doi.org/10.1088/1748-9326/aba6af Kane et al. [2015] Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the rim fire: Relative importance of local climate, fire weather, topography, and forest structure. Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001 Hoffman et al. [2015] Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015) Jain et al. [2020] Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019 Hultquist et al. [2014] Hultquist, C., Chen, G., Zhao, K.: A comparison of gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733 Zhou [2021] Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. 
Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. 
Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. 
Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of Statistics, 1189–1232 (2001)
Chen and Guestrin [2016] Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
Breiman [1996] Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
Breiman [2001] Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
Guinness et al. [2021] Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia’s approximation. The Comprehensive R Archive Network (2021)
Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
Okoli [2023] Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Statistics and Computing 31(26) (2021)
Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Huang et al. [2020] Huang, Y., Jin, Y., Schwartz, M.W., Thorne, J.H.: Intensified burn severity in California’s northern coastal mountains by drier climatic condition. Environmental Research Letters 15(10), 104033 (2020) https://doi.org/10.1088/1748-9326/aba6af
Kane et al. [2015] Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the Rim Fire: Relative importance of local climate, fire weather, topography, and forest structure. Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001
Hoffman et al. [2015] Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015)
Jain et al. [2020] Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019
Hultquist et al. [2014] Hultquist, C., Chen, G., Zhao, K.: A comparison of Gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733
Zhou [2021] Zhou, Z.-H.: Ensemble learning. SpringerLink (2021)
van Breugel et al. [2015] van Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in Ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x
van der Laan et al. [2007] van der Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C. Berkeley Division of Biostatistics Working Paper Series, Working Paper 222 (2007)
Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data (2023)
Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana. Journal of Remote Sensing 8(3) (2016)
Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling & Software 26(12) (2011)
Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis. Mathematical Geology 33 (2001)
Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
Davies and van der Laan [2016] Davies, M.M., van der Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning. International Journal of Biostatistics 12(1) (2016)
Erten et al. [2022] Erten, G.E., Yavuz, M., Deutsch, C.V.: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes. Natural Resources Research 31(1) (2022)
California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
Hijmans [2023] Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23, Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. 
Pedregosa et al.
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. 
Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . 
International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. 
In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. 
Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. 
Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. 
Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. 
Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958)
  10. Kane, V.R., Cansler, C.A., Povak, N.A., Kane, J.T., McGaughey, R.J., Lutz, J.A., Churchill, D.J., North, M.P.: Mixed severity fire effects within the Rim Fire: Relative importance of local climate, fire weather, topography, and forest structure. Forest Ecology and Management 358, 62–79 (2015) https://doi.org/10.1016/j.foreco.2015.09.001
  11. Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015)
  12. Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019
  13. Hultquist, C., Chen, G., Zhao, K.: A comparison of Gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733
  14. Zhou, Z.-H.: Ensemble learning. Springer (2021)
  15. van Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in Ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x
  16. van der Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C. Berkeley Division of Biostatistics Working Paper Series, Working Paper 222 (2007)
  17. Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data (2023)
  18. Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., El Haj, M., Fabre, F., Perrin, J.: Regional scale rain-forest height mapping using regression-kriging of spaceborne and airborne LiDAR data: Application on French Guiana. Journal of Remote Sensing 8(3) (2016)
  19. Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
  20. Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling and Software 26(12) (2011)
  21. Koike, K., Matsuda, S., Gu, B.: Evaluation of interpolation accuracy of neural kriging with application to temperature-distribution analysis. Mathematical Geology 33 (2001)
  22. Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
  23. Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
  24. Davies, M.M., van der Laan, M.J.: Optimal spatial prediction using ensemble machine learning. International Journal of Biostatistics 12(1) (2016)
  25. Erdogan Erten, G., Yavuz, M., Deutsch, C.V.: Combination of machine learning and kriging for spatial estimation of geological attributes. Natural Resources Research 31(1) (2022)
  26. California Department of Forestry and Fire Protection: CAL FIRE Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
  27. InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
  28. Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23, Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
  29. Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
  30. Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
  31. Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
  32. Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
  33. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
  34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
  35. Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics, 1189–1232 (2001)
  36. Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
  37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
  38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
  39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
  40. Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
  41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
  42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
  43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
  44. Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
  45. Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
  46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
  47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
  48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
[2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. 
https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. 
[2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. 
[2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. 
[2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. 
https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . 
Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. 
Machine Learning 24, 123–140 (1996)
Breiman [2001] Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
Rosenblatt [1958] Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
Guinness et al. [2021] Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable's Importance by Studying an Entire Class of Prediction Models Simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
Okoli [2023] Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
Katzfuss et al.
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. 
Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. 
Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. 
Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988). https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable's Importance by Studying an Entire Class of Prediction Models Simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Hoffman, C.M., Canfield, J., Linn, R.R., Mell, W., Sieg, C.H., Pimont, F., Ziegler, J.: Evaluating crown fire rate of spread predictions from physics-based models. Fire Technology 52(1), 221–237 (2015)
Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020). https://doi.org/10.1139/er-2020-0019
Hultquist, C., Chen, G., Zhao, K.: A comparison of Gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014). https://doi.org/10.1080/2150704x.2014.963733
Zhou, Z.-H.: Ensemble learning. Springer (2021)
van Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in Ethiopia. Ecosystems 19(2), 369–386 (2015). https://doi.org/10.1007/s10021-015-9938-x
van der Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C. Berkeley Division of Biostatistics Working Paper Series, Working Paper 222 (2007)
Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data (2023)
Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana. Remote Sensing 8(3) (2016)
Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling & Software 26(12) (2011)
Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis. Mathematical Geology 33 (2001)
Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021). https://doi.org/10.1080/01621459.2021.1950003
Davies, M.M., van der Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning. International Journal of Biostatistics 12(1) (2016)
Erdogan Erten, G., Yavuz, M., Deutsch, C.V.: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes. Natural Resources Research 31(1) (2022)
California Department of Forestry and Fire Protection: CAL FIRE Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23, Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
Saha et al.
In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. 
Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. 
Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. 
Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. 
Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. 
[2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  12. Jain, P., Coogan, S.C.P., Subramanian, S.G., Crowley, M., Taylor, S., Flannigan, M.D.: A review of machine learning applications in wildfire science and management. Environmental Reviews 28(4), 478–505 (2020) https://doi.org/10.1139/er-2020-0019
  13. Hultquist, C., Chen, G., Zhao, K.: A comparison of Gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733
  14. Zhou, Z.-H.: Ensemble Learning. Springer (2021)
  15. van Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in Ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x
  16. van der Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C. Berkeley Division of Biostatistics Working Paper Series, Working Paper 222 (2007)
  17. Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Annual Review of Statistics and Its Application 10 (2023)
  18. Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., El Haj, M., Fabre, F., Perrin, J.: Regional scale rain-forest height mapping using regression-kriging of spaceborne and airborne LiDAR data: application on French Guiana. Remote Sensing 8(3) (2016)
  19. Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
  20. Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling and Software 26(12) (2011)
  21. Koike, K., Matsuda, S., Gu, B.: Evaluation of interpolation accuracy of neural kriging with application to temperature-distribution analysis. Mathematical Geology 33 (2001)
  22. Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
  23. Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
  24. Davies, M.M., van der Laan, M.J.: Optimal spatial prediction using ensemble machine learning. International Journal of Biostatistics 12(1) (2016)
  25. Erdogan Erten, G., Deutsch, C.V., Yavuz, M.: Combination of machine learning and kriging for spatial estimation of geological attributes. Natural Resources Research 31(1) (2022)
  26. California Department of Forestry and Fire Protection: CAL FIRE Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
  27. InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
  28. Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
  29. Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
  30. Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
  31. Hoerl, A.E., Kennard, R.W.: Ridge regression: biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
  32. Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
  33. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
  34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
  35. Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
  36. Chen, T., Guestrin, C.: XGBoost: a scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
  37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
  38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
  39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
  40. Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
  41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
  42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
  43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
  44. Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
  45. Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
  46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
  47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on "Competition on Spatial Statistics for Large Datasets". Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
  48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
[2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. 
Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 
785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. 
[2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. 
Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. 
[2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. 
Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. 
https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data (2023)
Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., Haj, M.E., Fabre, F., Perrin, J.: Regional scale rain-forest height mapping using regression-kriging of spaceborne and airborne LiDAR data: Application on French Guiana. Remote Sensing 8(3) (2016)
Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling & Software 26(12) (2011)
Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of interpolation accuracy of neural kriging with application to temperature-distribution analysis. Mathematical Geology 33 (2001)
Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
Davies and van der Laan [2016] Davies, M.M., van der Laan, M.J.: Optimal spatial prediction using ensemble machine learning. International Journal of Biostatistics 12(1) (2016)
Erdogan Erten et al. [2022] Erdogan Erten, G., Yavuz, M., Deutsch, C.V.: Combination of machine learning and kriging for spatial estimation of geological attributes.
Natural Resources Research 31(1) (2022)
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. 
Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. 
Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. 
Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. 
Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. 
[2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Hultquist et al. [2014] Hultquist, C., Chen, G., Zhao, K.: A comparison of Gaussian process regression, random forests and support vector regression for burn severity assessment in diseased forests. Remote Sensing Letters 5(8), 723–732 (2014) https://doi.org/10.1080/2150704x.2014.963733
Zhou [2021] Zhou, Z.-H.: Ensemble Learning. Springer (2021)
van Breugel et al. [2015] van Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in Ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x
van der Laan et al. [2007] van der Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C. Berkeley Division of Biostatistics Working Paper Series, Working Paper 222 (2007)
Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Annual Review of Statistics and Its Application (2023)
Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., Haj, M.E., Fabre, F., Perrin, J.: Regional scale rain-forest height mapping using regression-kriging of spaceborne and airborne LiDAR data: application on French Guiana. Remote Sensing 8(3) (2016)
Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling & Software 26(12) (2011)
Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of interpolation accuracy of neural kriging with application to temperature-distribution analysis. Mathematical Geology 33 (2001)
Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
Davies and van der Laan [2016] Davies, M.M., van der Laan, M.J.: Optimal spatial prediction using ensemble machine learning. International Journal of Biostatistics 12(1) (2016)
Erten et al. [2022] Erten, G.E., Yavuz, M., Deutsch, C.V.: Combination of machine learning and kriging for spatial estimation of geological attributes. Natural Resources Research 31(1) (2022)
California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL FIRE Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
Hijmans [2023] Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23, Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of Statistics, 1189–1232 (2001)
Chen and Guestrin [2016] Chen, T., Guestrin, C.: XGBoost: a scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
Breiman [1996] Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
Breiman [2001] Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
Guinness et al. [2021] Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
Fisher et al.
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. 
[2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. 
U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. 
Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 
785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. 
Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . 
Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019)
Okoli [2023] Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE) (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale
Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana. Remote Sensing 8(3) (2016)
Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling and Software 26(12) (2011)
Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis. Mathematical Geology 33 (2001)
Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
Davies and van der Laan [2016] Davies, M.M., van der Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning. International Journal of Biostatistics 12(1) (2016)
Erten et al. [2022] Erten, G.E., Yavuz, M., Deutsch, C.V.: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes. Natural Resources Research 31(1) (2022)
California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
Hijmans [2023] Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
Friedman [2001] Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
Chen and Guestrin [2016] Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
Breiman [1996] Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
Breiman [2001] Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
Rosenblatt [1958] Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
Guinness et al. [2021] Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable's Importance by Studying an Entire Class of Prediction Models Simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. 
Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. 
Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. 
In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 
785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  14. Zhou, Z.-H.: Ensemble learning. SpringerLink (2021) van Breugel et al. [2015] Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x van der Laan et al. [2007] Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C Berkeley Divison of Biostatistics Working Paper Series Working Paper 222 (2007) Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Statistical Deep Learning for Spatial and Spatiotemporal Data (2023) Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Héraul, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana . Journal of Remote Sensing 8(3) (2016) Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. 
Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale
Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31, 26 (2021)
Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on competition on spatial statistics for large datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
van Breugel et al. [2015] van Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in Ethiopia. Ecosystems 19(2), 369–386 (2015). https://doi.org/10.1007/s10021-015-9938-x
van der Laan et al. [2007] van der Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C. Berkeley Division of Biostatistics Working Paper Series, Working Paper 222 (2007)
Wikle and Zammit-Mangion [2023] Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Annual Review of Statistics and Its Application 10 (2023)
Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., Haj, M.E., Fabre, F., Perrin, J.: Regional scale rain-forest height mapping using regression-kriging of spaceborne and airborne LiDAR data: application on French Guiana. Remote Sensing 8(3) (2016)
Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling & Software 26(12) (2011)
Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of interpolation accuracy of neural kriging with application to temperature-distribution analysis. Mathematical Geology 33 (2001)
Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021). https://doi.org/10.1080/01621459.2021.1950003
Davies and van der Laan [2016] Davies, M.M., van der Laan, M.J.: Optimal spatial prediction using ensemble machine learning. International Journal of Biostatistics 12(1) (2016)
Erten et al. [2022] Erten, G.E., Yavuz, M., Deutsch, C.V.: Combination of machine learning and kriging for spatial estimation of geological attributes. Natural Resources Research 31(1) (2022)
California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL FIRE incidents. https://www.fire.ca.gov/incidents/. Accessed 2023-08-20
InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
Hijmans [2023] Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
Chen and Guestrin [2016] Chen, T., Guestrin, C.: XGBoost: a scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
Breiman [1996] Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
Breiman [2001] Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988). https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
Guinness et al. [2021] Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
Pedregosa et al.
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
Okoli [2023] Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . 
International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. 
In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. 
Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
Guinness et al. [2021] Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
Okoli [2023] Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Hijmans [2023] Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23. The Comprehensive R Archive Network (2023). https://cran.r-project.org/web/packages/raster/index.html
Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
Friedman [2001] Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics, 1189–1232 (2001)
Chen and Guestrin [2016] Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
Breiman [1996] Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
Breiman [2001] Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
Rosenblatt [1958] Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988). https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  15. Breugel, P., Friis, I., Demissew, S., Lillesø, J.-P.B., Kindt, R.: Current and future fire regimes and their influence on natural vegetation in Ethiopia. Ecosystems 19(2), 369–386 (2015) https://doi.org/10.1007/s10021-015-9938-x
  16. van der Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C. Berkeley Division of Biostatistics Working Paper Series, Working Paper 222 (2007)
  17. Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data (2023)
  18. Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., Haj, M.E., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana. Remote Sensing 8(3) (2016)
  19. Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
  20. Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling & Software 26(12) (2011)
  21. Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis. Mathematical Geology 33 (2001)
  22. Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
  23. Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
  24. Davies, M.M., van der Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning. International Journal of Biostatistics 12(1) (2016)
  25. Erten, G.E., Yavuz, M., Deutsch, C.V.: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes. Natural Resources Research 31(1) (2022)
  26. California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
  27. InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
  28. Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23, Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
  29. Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
  30. Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
  31. Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
  32. Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
  33. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
  34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
  35. Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics, 1189–1232 (2001)
  36. Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
  37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
  38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
  39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
  40. Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
  41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
  42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
  43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
  44. Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable's Importance by Studying an Entire Class of Prediction Models Simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
  45. Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
  46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
  47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
  48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. 
Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). 
https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. 
Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). 
https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . 
International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. 
In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  16. van der Laan, M.J., Polley, E.C., Hubbard, A.E.: Super Learner. U.C. Berkeley Division of Biostatistics Working Paper Series, Working Paper 222 (2007)
  17. Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data. Annual Review of Statistics and Its Application 10 (2023)
  18. Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., El Hajj, M., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana. Remote Sensing 8(3) (2016)
  19. Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
  20. Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling & Software 26(12) (2011)
  21. Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis. Mathematical Geology 33 (2001)
  22. Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
  23. Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
  24. Davies, M.M., van der Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning. International Journal of Biostatistics 12(1) (2016)
  25. Erdogan Erten, G., Deutsch, C.V., Yavuz, M.: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes. Natural Resources Research 31(1) (2022)
  26. California Department of Forestry and Fire Protection: CAL FIRE Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
  27. InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
  28. Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23, Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
  29. Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
  30. Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
  31. Hoerl, A.E., Kennard, R.W.: Ridge regression: biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
  32. Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
  33. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
  34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
  35. Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
  36. Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
  37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
  38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
  39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
  40. Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
  41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
  42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian Process Computation Using Vecchia's Approximation. R package, The Comprehensive R Archive Network (2021)
  43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
  44. Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable's Importance by Studying an Entire Class of Prediction Models Simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
  45. Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
  46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
  47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
  48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink4, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables . PeerJ 6 (2018) Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . 
Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. 
Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. 
[2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. 
Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. 
https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. 
Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 
20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 
20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  17. Wikle, C.K., Zammit-Mangion, A.: Statistical deep learning for spatial and spatiotemporal data (2023)
  18. Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., Haj, M.E., Fabre, F., Perrin, J.: Regional scale rain-forest height mapping using regression-kriging of spaceborne and airborne LiDAR data: Application on French Guiana. Remote Sensing 8(3) (2016)
  19. Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
  20. Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling and Software 26(12) (2011)
  21. Koike, K., Matsuda, S., Gu, B.: Evaluation of interpolation accuracy of neural kriging with application to temperature-distribution analysis. Mathematical Geology 33 (2001)
  22. Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
  23. Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
  24. Davies, M.M., van der Laan, M.J.: Optimal spatial prediction using ensemble machine learning. International Journal of Biostatistics 12(1) (2016)
  25. Erdogan Erten, G., Yavuz, M., Deutsch, C.V.: Combination of machine learning and kriging for spatial estimation of geological attributes. Natural Resources Research 31(1) (2022)
  26. California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
  27. InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
  28. Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23, Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
  29. Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
  30. Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
  31. Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
  32. Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
  33. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
  34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
  35. Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
  36. Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
  37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
  38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
  39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
  40. Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
  41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
  42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
  43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
  44. Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
  45. Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
  46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
  47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
  48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . 
Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. 
Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . 
International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. 
In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. 
Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American Statistical Association 58(302), 415–434 (1963)
Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. 
[2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Fayad et al. [2016] Fayad, I., Baghdadi, N., Bailly, J.-S., Barbier, N., Gond, V., Hérault, B., El Hajj, M., Fabre, F., Perrin, J.: Regional Scale Rain-Forest Height Mapping Using Regression-Kriging of Spaceborne and Airborne LiDAR Data: Application on French Guiana. Remote Sensing 8(3) (2016)
Hengl et al. [2018] Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
Li et al. [2011] Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling & Software 26(12) (2011)
Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis. Mathematical Geology 33 (2001)
Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
Davies and van der Laan [2016] Davies, M.M., van der Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning. International Journal of Biostatistics 12(1) (2016)
Erten et al. [2022] Erten, G.E., Yavuz, M., Deutsch, C.V.: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes. Natural Resources Research 31(1) (2022)
California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL FIRE Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
Hijmans [2023] Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23, Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
Chen and Guestrin [2016] Chen, T., Guestrin, C.: XGBoost: a scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
Breiman [1996] Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
Breiman [2001] Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
Guinness et al. [2021] Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable's Importance by Studying an Entire Class of Prediction Models Simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
Okoli [2023] Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions.
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Pedregosa et al.
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. 
Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . 
International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. 
In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. 
Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  19. Hengl, T., Nussbaum, M., Wright, M.N., Heuvelink, G.B.M., Gräler, B.: Random forest as a generic framework for predictive modeling of spatial and spatio-temporal variables. PeerJ 6 (2018)
  20. Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables. Environmental Modelling & Software 26(12) (2011)
  21. Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis. Mathematical Geology 33 (2001)
  22. Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
  23. Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
  24. Davies, M.M., van der Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning. International Journal of Biostatistics 12(1) (2016)
  25. Erdogan Erten, G., Yavuz, M., Deutsch, C.V.: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes. Natural Resources Research 31(1) (2022)
  26. California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
  27. InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
  28. Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23, Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
  29. Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
  30. Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
  31. Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
  32. Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
  33. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
  34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
  35. Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
  36. Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
  37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
  38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
  39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
  40. Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
  41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
  42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
  43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
  44. Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable's Importance by Studying an Entire Class of Prediction Models Simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
  45. Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
  46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
  47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
  48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system.
In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. 
Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. 
Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
  20. Li, J., Heap, A.D., Potter, A., Daniell, J.J.: Application of machine learning methods to spatial interpolation of environmental variables‘ . Enviromental Modeling and Software 26(12) (2011) Koike et al. [2001] Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis . Mathematical Geology 33 (2001) Yasrebi et al. [2020] Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. 
[2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran . Arabian Journal of Geosciences 13(748) (2020) Saha et al. [2021] Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). 
https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. 
Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003 Davies and van der Laan [2016] Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Davies, M.M., Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning . International Journal of Biostatistics 12(1) (2016) Gamze Erdogan Erten [2022] Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. 
Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Gamze Erdogan Erten, C.V.D. Mahmut Yavuz: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes . Natural Resources Research 31(1) (2022) California Department of Forestry and Fire Protection [2023] California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. 
Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. 
Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. 
Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological Review 65(6), 386 (1958)
  21. Koike, K., Matsuda, S., Gu, B.: Evaluation of Interpolation Accuracy of Neural Kriging with Application to Temperature-Distribution Analysis. Mathematical Geology 33 (2001)
  22. Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
  23. Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
  24. Davies, M.M., van der Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning. International Journal of Biostatistics 12(1) (2016)
  25. Erdogan Erten, G., Deutsch, C.V., Yavuz, M.: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes. Natural Resources Research 31(1) (2022)
  26. California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
  27. InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
  28. Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23, Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
  29. Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
  30. Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
  31. Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
  32. Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
  33. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
  34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
  35. Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
  36. Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
  37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
  38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
  39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
  40. Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
  41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
  42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
  43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
  44. Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
  45. Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
  46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
  47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
  48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions.
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. 
Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. 
[2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. 
Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. 
In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. 
Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. 
Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. 
The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  22. Yasrebi, A.B., Hezarkhani, A., Afzal, P., Karami, R., Tehrani, M.E., Borumandnia, A.: Application of an ordinary kriging–artificial neural network for elemental distribution in Kahang porphyry deposit, Central Iran. Arabian Journal of Geosciences 13(748) (2020)
  23. Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
  24. Davies, M.M., van der Laan, M.J.: Optimal spatial prediction using ensemble machine learning. International Journal of Biostatistics 12(1) (2016)
  25. Erdogan Erten, G., Deutsch, C.V., Yavuz, M.: Combination of machine learning and kriging for spatial estimation of geological attributes. Natural Resources Research 31(1) (2022)
  26. California Department of Forestry and Fire Protection: CAL FIRE incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
  27. InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
  28. Hijmans, R.J.: raster: Geographic data analysis and modeling. R package version 3.6-23. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
  29. Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
  30. Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
  31. Hoerl, A.E., Kennard, R.W.: Ridge regression: biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
  32. Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
  33. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
  34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
  35. Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
  36. Chen, T., Guestrin, C.: XGBoost: a scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
  37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
  38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
  39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
  40. Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
  41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
  42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: fast Gaussian process computation using Vecchia’s approximation. The Comprehensive R Archive Network (2021)
  43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
  44. Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: learning a variable’s importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
  45. Okoli, C.: ale: Interpretable machine learning and statistical inference with accumulated local effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
  46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Statistics and Computing 31(26) (2021)
  47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on competition on spatial statistics for large datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
  48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. 
Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. 
Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. 
Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. 
In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 
785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006)
  23. Saha, A., Basu, S., Datta, A.: Random forests for spatially dependent data. Journal of the American Statistical Association 118(541), 665–683 (2021) https://doi.org/10.1080/01621459.2021.1950003
  24. Davies, M.M., van der Laan, M.J.: Optimal spatial prediction using ensemble machine learning. International Journal of Biostatistics 12(1) (2016)
  25. Erten, G.E., Yavuz, M., Deutsch, C.V.: Combination of machine learning and kriging for spatial estimation of geological attributes. Natural Resources Research 31(1) (2022)
  26. California Department of Forestry and Fire Protection: CAL FIRE incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
  27. InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
  28. Hijmans, R.J.: raster: Geographic data analysis and modeling. R package version 3.6-23. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
  29. Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
  30. Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
  31. Hoerl, A.E., Kennard, R.W.: Ridge regression: biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
  32. Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
  33. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
  34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
  35. Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
  36. Chen, T., Guestrin, C.: XGBoost: a scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
  37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
  38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
  39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
  40. Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
  41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
  42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
  43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
  44. Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
  45. Okoli, C.: ale: Interpretable machine learning and statistical inference with accumulated local effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
  46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
  47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on "Competition on Spatial Statistics for Large Datasets". Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
  48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. 
Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. 
Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). 
https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. 
Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 
785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. 
Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 
20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 
20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  24. Davies, M.M., van der Laan, M.J.: Optimal Spatial Prediction Using Ensemble Machine Learning. International Journal of Biostatistics 12(1) (2016)
  25. Erten, G.E., Yavuz, M., Deutsch, C.V.: Combination of Machine Learning and Kriging for Spatial Estimation of Geological Attributes. Natural Resources Research 31(1) (2022)
  26. California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
  27. InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
  28. Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
  29. Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
  30. Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
  31. Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
  32. Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
  33. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
  34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
  35. Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
  36. Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
  37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
  38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
  39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
  40. Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
  41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
  42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
  43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
  44. Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
  45. Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
  46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31, 26 (2021)
  47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological and Environmental Statistics 26(4) (2021)
  48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological and Environmental Statistics 25(3) (2020)
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 
785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. 
Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. 
[2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Friedman, J.H.: Greedy function approximation: a gradient boosting machine. 
Annals of Statistics, 1189–1232 (2001)
Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on "Competition on Spatial Statistics for Large Datasets". Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Erten, G.E., Deutsch, C.V., Yavuz, M.: Combination of machine learning and kriging for spatial estimation of geological attributes. Natural Resources Research 31(1) (2022)
California Department of Forestry and Fire Protection: CAL FIRE Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20
InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23, Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. 
Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Friedman, J.H.: Greedy function approximation: a gradient boosting machine. 
Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. 
Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  26. California Department of Forestry and Fire Protection: CAL Fire Incidents. https://www.fire.ca.gov/incidents/ Accessed 2023-08-20 InciWeb [2022] InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. 
[2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/ Hijmans [2023] Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. 
Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. 
In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
  27. InciWeb: Incident Information System (2022). https://inciweb.nwcg.gov/
  28. Hijmans, R.J.: raster: Geographic Data Analysis and Modeling. R package version 3.6-23. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html
  29. Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
  30. Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
  31. Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
  32. Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
  33. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
  34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
  35. Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
  36. Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
  37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
  38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
  39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
  40. Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
  41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988). https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
  42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
  43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
  44. Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
  45. Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
  46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
  47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
  48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. 
In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. 
Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. 
Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. 
The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  28. Hijmans, R.J.: Geographic Data Analysis and modeling [R package raster version 3.6-23]. Comprehensive R Archive Network (CRAN) (2023). https://cran.r-project.org/web/packages/raster/index.html Zou and Hastie [2005] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. 
Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005) Morgan and Sonquist [1963] Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. 
Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. 
Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 
785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
  29. Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology 67(2), 301–320 (2005)
  30. Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58(302), 415–434 (1963)
  31. Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
  32. Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
  33. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
  34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
  35. Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
  36. Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
  37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
  38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
  39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
  40. Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
  41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
  42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
  43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
  44. Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
  45. Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
  46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
  47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
  48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. 
The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  30. Morgan, J.N., Sonquist, J.A.: Problems in the analysis of survey data, and a proposal. Journal of the American statistical association 58(302), 415–434 (1963) Hoerl and Kennard [1970a] Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. 
The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970) Hoerl and Kennard [1970b] Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. 
IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). 
R package version 0.1.0. https://CRAN.R-project.org/package=ale
  31. Hoerl, A.E., Kennard, R.W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
  32. Hoerl, A.E., Kennard, R.W.: Ridge regression: Applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970)
  33. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
  34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
  35. Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
  36. Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
  37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
  38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
  39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
  40. Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
  41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988). https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
  42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
  43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
  44. Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
  45. Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE) (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale
  46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
  47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
  48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 
20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. 
Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. 
The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 
20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  32. Hoerl, A.E., Kennard, R.W.: Ridge regression: applications to nonorthogonal problems. Technometrics 12(1), 69–82 (1970) Tibshirani [1996] Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996) Cover and Hart [1967] Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE transactions on information theory 13(1), 21–27 (1967) Friedman [2001] Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189–1232 (2001) Chen and Guestrin [2016] Chen, T., Guestrin, C.: Xgboost: A scalable tree boosting system. In: Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016) Breiman [1996] Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. 
Machine learning 63, 3–42 (2006)
  33. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology 58(1), 267–288 (1996)
  34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
  35. Friedman, J.H.: Greedy function approximation: a gradient boosting machine. Annals of Statistics, 1189–1232 (2001)
  36. Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
  37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
  38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
  39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. 
Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. 
[2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  34. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
  35. Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics, 1189–1232 (2001)
  36. Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
  37. Breiman, L.: Bagging predictors. Machine Learning 24, 123–140 (1996)
  38. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)
  39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine Learning 63, 3–42 (2006)
  40. Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65(6), 386 (1958)
  41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
  42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
  43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
  44. Fisher, A., Rudin, C., Dominici, F.: All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
  45. Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
  46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
  47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
  48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  35. Friedman, J.H.: Greedy function approximation: A gradient boosting machine. Annals of Statistics 29(5), 1189–1232 (2001)
  36. Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794 (2016)
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 
20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 
20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  37. Breiman, L.: Bagging predictors. Machine learning 24, 123–140 (1996) Breiman [2001] Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. 
Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. 
[2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. 
Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  38. Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Geurts et al. [2006] Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. 
[2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. 
Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. 
[2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  39. Geurts, P., Ernst, D., Wehenkel, L.: Extremely randomized trees. Machine learning 63, 3–42 (2006) Rosenblatt [1958] Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. 
The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 
20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  40. Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review 65(6), 386 (1958) Vecchia [1988] Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . 
Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x Guinnes et al. [2021] Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinnes, J., Katzfuss, M., Fahmy, Y.: Gpgp: Fast gaussian process computation using vecchia’s approximation. 
The Comprehensive R Archive Network (2021) Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Fisher et al. [2019] Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 
20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable’s Importance by Studying an Entire Class of Prediction Models Simultaneously. J. Mach. Learn. Res. 20(177), 1–81 (2019) Okoli [2023] Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Okoli, C.: Ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). (2023). R package version 0.1.0. 
https://CRAN.R-project.org/package=ale Guinness [2021] Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia’s approximation. Journal of Statistics and Computing 31(26) (2021) Allard et al. [2021] Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021) Katzfuss et al. [2020] Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020) Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions . Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
  41. Vecchia, A.V.: Estimation and model identification for continuous spatial processes. Journal of the Royal Statistical Society: Series B (Methodological) 50(2), 297–312 (1988) https://doi.org/10.1111/j.2517-6161.1988.tb01729.x
  42. Guinness, J., Katzfuss, M., Fahmy, Y.: GpGp: Fast Gaussian process computation using Vecchia's approximation. The Comprehensive R Archive Network (2021)
  43. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
  44. Fisher, A., Rudin, C., Dominici, F.: All Models are Wrong, but Many are Useful: Learning a Variable's Importance by Studying an Entire Class of Prediction Models Simultaneously. Journal of Machine Learning Research 20(177), 1–81 (2019)
  45. Okoli, C.: ale: Interpretable Machine Learning and Statistical Inference with Accumulated Local Effects (ALE). R package version 0.1.0 (2023). https://CRAN.R-project.org/package=ale
  46. Guinness, J.: Gaussian process learning via Fisher scoring of Vecchia's approximation. Statistics and Computing 31(26) (2021)
  47. Allard, D., Clarotto, L., Opitz, T., Romary, T.: Discussion on Competition on Spatial Statistics for Large Datasets. Journal of Agricultural, Biological, and Environmental Statistics 26(4) (2021)
  48. Katzfuss, M., Guinness, J., Gong, W., Zilber, D.: Vecchia approximations of Gaussian-process predictions. Journal of Agricultural, Biological, and Environmental Statistics 25(3) (2020)
Authors (8)
  1. Nicholas Simafranca (1 paper)
  2. Bryant Willoughby (1 paper)
  3. Erin O'Neil (2 papers)
  4. Sophie Farr (1 paper)
  5. Naomi Giertych (4 papers)
  6. Margaret Johnson (5 papers)
  7. Madeleine Pascolini-Campbell (1 paper)
  8. Brian J Reich (17 papers)
Citations (2)
