Causal hybrid modeling with double machine learning (2402.13332v2)

Published 20 Feb 2024 in cs.LG and stat.ME

Abstract: Hybrid modeling integrates machine learning with scientific knowledge to enhance interpretability, generalization, and adherence to natural laws. However, equifinality and biases introduced by regularization make these goals difficult to achieve. This paper introduces a novel approach to estimating hybrid models via a causal inference framework, specifically employing Double Machine Learning (DML) to estimate causal effects. We showcase its use for the Earth sciences on two problems related to carbon dioxide fluxes. In the $Q_{10}$ model, we demonstrate that DML-based hybrid modeling is superior to end-to-end deep neural network (DNN) approaches for estimating causal parameters, demonstrating efficiency, robustness to bias from regularization methods, and circumvention of equifinality. Our approach, applied to carbon flux partitioning, exhibits flexibility in accommodating heterogeneous causal effects. The study emphasizes the necessity of explicitly defining causal graphs and relationships, advocating for this as a general best practice. We encourage the continued exploration of causality in hybrid models for more interpretable and trustworthy results in knowledge-guided machine learning.
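To make the abstract's core idea concrete, the sketch below shows how DML can recover a $Q_{10}$-style temperature sensitivity in a partially linear model. This is a minimal illustration, not the paper's exact setup: the synthetic data-generating process, the choice of random forests as nuisance learners, and all variable names are assumptions. The $Q_{10}$ model $R = R_b(X)\,Q_{10}^{(T-15)/10}$ log-linearizes to $\log R = \log R_b(X) + D \log Q_{10}$ with $D = (T-15)/10$, so cross-fitted residual-on-residual regression (the standard DML recipe) estimates $\log Q_{10}$.

```python
# Hedged sketch: DML estimation of a Q10 temperature sensitivity.
# The data-generating process and nuisance models are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
n = 2000

# Confounders X (e.g. radiation, moisture proxies) drive both base
# respiration Rb(X) and temperature T, creating the confounding DML handles.
X = rng.uniform(0, 1, size=(n, 2))
T = 10 + 15 * X[:, 0] + rng.normal(0, 2, n)      # air temperature, deg C
Rb = np.exp(0.5 + X[:, 0] + 0.5 * X[:, 1])       # base respiration Rb(X)
Q10_true = 1.5
R = Rb * Q10_true ** ((T - 15.0) / 10.0) * np.exp(rng.normal(0, 0.1, n))

# Log-linearize: log R = log Rb(X) + D * log Q10, with D = (T - 15)/10.
Y = np.log(R)
D = (T - 15.0) / 10.0

# Cross-fitted DML: residualize Y and D on X with ML nuisance models,
# then regress residual on residual to get the causal parameter.
theta_num, theta_den = 0.0, 0.0
for train, test in KFold(n_splits=2, shuffle=True, random_state=0).split(X):
    mY = RandomForestRegressor(random_state=0).fit(X[train], Y[train])
    mD = RandomForestRegressor(random_state=0).fit(X[train], D[train])
    rY = Y[test] - mY.predict(X[test])
    rD = D[test] - mD.predict(X[test])
    theta_num += rD @ rY
    theta_den += rD @ rD

theta = theta_num / theta_den    # estimate of log Q10
Q10_hat = np.exp(theta)
print(f"estimated Q10: {Q10_hat:.2f} (true {Q10_true})")
```

Neyman orthogonality makes the final estimate first-order insensitive to errors in the two nuisance fits, which is why, per the abstract, DML avoids the regularization bias that an end-to-end DNN would pass straight into the $Q_{10}$ estimate.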
