
Hybrid Modeling Design Patterns (2401.00033v1)

Published 29 Dec 2023 in cs.AI and cs.LG

Abstract: Design patterns provide a systematic way to convey solutions to recurring modeling challenges. This paper introduces design patterns for hybrid modeling, an approach that combines modeling based on first principles with data-driven modeling techniques. While the two approaches have complementary advantages, there are often multiple ways to combine them into a hybrid model, and the appropriate choice depends on the problem at hand. In this paper, we provide four base patterns that can serve as blueprints for combining data-driven components with domain knowledge into a hybrid approach. In addition, we present two composition patterns that govern the combination of the base patterns into more complex hybrid models. Each design pattern is illustrated by typical use cases from application areas such as climate modeling, engineering, and physics.
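The abstract names the base patterns only in general terms, so as a concrete illustration, here is a minimal sketch of one widely used form of hybrid model: a first-principles baseline plus a data-driven residual correction. All names and the example physics (free fall with an unmodeled drag-like term) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def physics_model(t):
    # First-principles component: idealized free-fall displacement,
    # deliberately ignoring drag.
    g = 9.81
    return 0.5 * g * t**2

def fit_residual_model(t, y_obs):
    # Data-driven component: least-squares polynomial fit to the
    # discrepancy between observations and the physics baseline,
    # capturing unmodeled effects.
    residual = y_obs - physics_model(t)
    coeffs = np.polyfit(t, residual, deg=2)
    return np.poly1d(coeffs)

def hybrid_predict(t, residual_model):
    # Hybrid prediction = physics baseline + learned correction.
    return physics_model(t) + residual_model(t)

# Synthetic "observations": the true system loses displacement to a
# drag-like quadratic term the physics model does not include.
t = np.linspace(0.0, 2.0, 50)
y_obs = physics_model(t) - 0.3 * t**2

residual_model = fit_residual_model(t, y_obs)
y_hybrid = hybrid_predict(t, residual_model)

# Compare worst-case errors: the hybrid model should track the
# observations far better than physics alone.
physics_err = np.max(np.abs(physics_model(t) - y_obs))
hybrid_err = np.max(np.abs(y_hybrid - y_obs))
```

The design choice here is that the physics model carries the dominant behavior while the learned component only has to represent the (typically smoother, smaller) residual, which usually needs far less data than learning the full dynamics from scratch.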
