Path Signatures and Graph Neural Networks for Slow Earthquake Analysis: Better Together? (2402.03558v1)

Published 5 Feb 2024 in cs.LG and physics.geo-ph

Abstract: The path signature, which has enjoyed recent success in the machine learning community, is a theoretically grounded method for engineering features from irregular paths. Graph neural networks (GNNs), in turn, are neural architectures for processing data on graphs and excel on tasks with irregular domains, such as sensor networks. In this paper, we introduce Path Signature Graph Convolutional Neural Networks (PS-GCNN), a novel approach that integrates path signatures into graph convolutional neural networks (GCNNs), leveraging the strengths of path signatures for feature extraction and of GCNNs for handling spatial interactions. We apply our method to the analysis of slow earthquake sequences, also called slow slip events (SSEs), using GPS time-series data, with a case study of a GPS sensor network on the east coast of New Zealand's North Island. We also benchmark our method on simulated stochastic differential equations that model similar reaction-diffusion phenomena. Our methodology shows promise for future advances in earthquake prediction and sensor network analysis.
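To make the pipeline the abstract describes more concrete, here is a minimal sketch of the two ingredients: a depth-2 path signature computed per sensor time series (via Chen's relation for piecewise-linear paths), and one Kipf–Welling-style graph convolution that mixes those features across the sensor network. The function names (`sig_level2`, `gcn_layer`) and the NumPy implementation are illustrative assumptions, not the paper's actual PS-GCNN code.

```python
import numpy as np

def sig_level2(path):
    """Depth-2 signature of a piecewise-linear path (T x d array).

    Returns the flattened [level-1 (d terms), level-2 (d*d terms)]
    iterated integrals, accumulated segment by segment via Chen's
    relation; the signature of a single linear segment with increment
    dx has level-2 part 0.5 * outer(dx, dx).
    """
    inc = np.diff(path, axis=0)        # segment increments, shape (T-1, d)
    d = path.shape[1]
    s1 = inc.sum(axis=0)               # level 1: total displacement
    s2 = np.zeros((d, d))
    run = np.zeros(d)                  # running level-1 signature
    for dx in inc:
        s2 += np.outer(run, dx) + 0.5 * np.outer(dx, dx)
        run += dx
    return np.concatenate([s1, s2.ravel()])

def gcn_layer(A, F, W):
    """One graph convolution: symmetric normalization with self-loops,
    then a linear map and ReLU (illustrative, not the paper's layer)."""
    A_hat = A + np.eye(A.shape[0])
    deg = A_hat.sum(axis=1)
    A_norm = A_hat / np.sqrt(np.outer(deg, deg))
    return np.maximum(A_norm @ F @ W, 0.0)

# Toy usage: two "GPS stations", each a short 2-D displacement path.
paths = [np.array([[0.0, 0.0], [1.0, 2.0]]),
         np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]])]
F = np.vstack([sig_level2(p) for p in paths])   # node feature matrix
A = np.array([[0.0, 1.0], [1.0, 0.0]])          # sensor adjacency
H = gcn_layer(A, F, np.ones((F.shape[1], 3)))   # mixed embeddings
```

The antisymmetric part of the level-2 terms recovers the Lévy area of each path, one reason signatures capture geometric information that pointwise summaries miss.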

