Physics-Informed Neural Networks for Satellite State Estimation (2403.19736v1)
Abstract: The Space Domain Awareness (SDA) community routinely tracks satellites in orbit by fitting an orbital state to observations made by the Space Surveillance Network (SSN). In order to fit such orbits, an accurate model of the forces acting on the satellite is required. Over the past several decades, high-quality, physics-based models have been developed for satellite state estimation and propagation. These models are exceedingly good at estimating and propagating orbital states for non-maneuvering satellites; however, there are several classes of anomalous accelerations that a satellite might experience which are not well-modeled, such as satellites that use low-thrust electric propulsion to modify their orbit. Physics-Informed Neural Networks (PINNs) are a valuable tool for these classes of satellites, as they combine physics models with Deep Neural Networks (DNNs), which are highly expressive and versatile function approximators. By combining a physics model with a DNN, the machine learning model need not learn astrodynamics, which results in more efficient and effective use of machine learning resources. This paper details the application of PINNs to estimating the orbital state and a continuous, low-amplitude anomalous acceleration profile for satellites. The PINN is trained to learn the unknown acceleration by minimizing the mean squared error of the observations. We compare the performance of pure physics models with that of PINNs in terms of their observation residuals and their propagation accuracy beyond the fit span of the observations. For a two-day simulation of a GEO satellite with an unmodeled acceleration profile on the order of $10^{-8}\ \text{km/s}^2$, the PINN outperformed the best-fit physics model by orders of magnitude in both observation residuals (123 arcsec for the physics model vs. 1.00 arcsec for the PINN) and propagation accuracy (3860 km vs. 164 km after five days).
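The training setup described in the abstract, a physics propagator augmented with a neural-network acceleration term, fit by minimizing the mean squared error of observations, can be sketched as follows. This is a minimal, illustrative sketch in PyTorch and not the authors' implementation: it assumes simple two-body gravity as the physics model, a hand-rolled RK4 integrator, a small MLP (`AnomalousAcceleration`) for the unknown acceleration, and stand-in position pseudo-observations (`obs_times`, `obs_positions`) where the real system would fit SSN angle observations; all names and parameters are hypothetical.

```python
import torch
import torch.nn as nn

MU_EARTH = 398600.4418  # Earth's gravitational parameter [km^3/s^2]

class AnomalousAcceleration(nn.Module):
    """Small MLP mapping the 6-element state (position, velocity) to a low-amplitude acceleration [km/s^2]."""
    def __init__(self, hidden=64, scale=1e-8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(6, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 3),
        )
        self.scale = scale  # biases the learned term toward the ~1e-8 km/s^2 amplitude quoted in the abstract

    def forward(self, state):
        return self.scale * self.net(state)

def dynamics(state, accel_net):
    """Two-body gravity plus the learned anomalous acceleration (the physics + NN split of a PINN)."""
    r, v = state[:3], state[3:]
    a_gravity = -MU_EARTH * r / torch.norm(r) ** 3
    return torch.cat([v, a_gravity + accel_net(state)])

def rk4_step(state, dt, accel_net):
    """One classical fourth-order Runge-Kutta step of length dt [s]."""
    k1 = dynamics(state, accel_net)
    k2 = dynamics(state + 0.5 * dt * k1, accel_net)
    k3 = dynamics(state + 0.5 * dt * k2, accel_net)
    k4 = dynamics(state + dt * k3, accel_net)
    return state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def propagate(state0, times, accel_net, max_dt=300.0):
    """Propagate the state to each observation epoch (times sorted, in seconds)."""
    states, state, t = [], state0, times[0]
    for t_next in times:
        while t < t_next:
            dt = torch.clamp(t_next - t, max=max_dt)
            state = rk4_step(state, dt, accel_net)
            t = t + dt
        states.append(state)
    return torch.stack(states)

# Hypothetical pseudo-observations: real SSN data would be angles (e.g. RA/Dec);
# noisy positions are used here only to keep the sketch self-contained.
obs_times = torch.linspace(0.0, 172800.0, 49)          # two days of epochs [s]
obs_positions = torch.randn(49, 3) * 10.0 + 42164.0    # stand-in measurements [km]

accel_net = AnomalousAcceleration()
state0 = torch.tensor([42164.0, 0.0, 0.0, 0.0, 3.0747, 0.0], requires_grad=True)  # rough GEO state [km, km/s]
optimizer = torch.optim.Adam(list(accel_net.parameters()) + [state0], lr=1e-3)

for epoch in range(100):
    optimizer.zero_grad()
    pred = propagate(state0, obs_times, accel_net)
    loss = torch.mean((pred[:, :3] - obs_positions) ** 2)  # MSE of the observations
    loss.backward()
    optimizer.step()
```

A higher-fidelity setup would replace two-body gravity with the full force model, map propagated states through an observation model to angles before forming the residuals, and could use an adaptive solver such as torchdiffeq's `odeint` in place of the fixed-step RK4; the output scale on the MLP is one simple way to keep the learned acceleration in the low-amplitude regime the paper targets.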