Sampling-based Distributed Training with Message Passing Neural Network (2402.15106v4)

Published 23 Feb 2024 in cs.LG, cs.DC, and physics.flu-dyn

Abstract: In this study, we introduce a domain-decomposition-based distributed training and inference approach for message-passing neural networks (MPNN). Our objective is to address the challenge of scaling edge-based graph neural networks as the number of nodes increases. Through our distributed training approach, coupled with Nyström-approximation sampling techniques, we present a scalable graph neural network, referred to as DS-MPNN (D and S standing for distributed and sampled, respectively), capable of scaling up to $O(10^5)$ nodes. We validate our sampling and distributed training approach on two cases: (a) a Darcy flow dataset and (b) steady RANS simulations of 2-D airfoils, providing comparisons with both a single-GPU implementation and node-based graph convolution networks (GCNs). The DS-MPNN model demonstrates accuracy comparable to the single-GPU implementation, accommodates a significantly larger number of nodes than the single-GPU variant (S-MPNN), and significantly outperforms the node-based GCN.
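
The core idea in the abstract, sampling a subset of edges so that per-step message-passing cost no longer grows with the full edge count, can be illustrated with a short PyTorch sketch. The layer below draws a random subset of edges on each forward pass and aggregates messages only over that subset; in the full DS-MPNN method each GPU would additionally run such layers on its own subdomain and exchange boundary-node features (e.g., via torch.distributed). All names here (SampledMPLayer, num_sampled_edges) and the uniform random sampling are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of edge-sampled message passing, assuming plain PyTorch.
# Illustrative only: the paper's Nystrom-approximation sampling and
# domain-decomposed multi-GPU training are not reproduced here.
import torch
import torch.nn as nn


class SampledMPLayer(nn.Module):
    """One message-passing step over a random subset of edges."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.edge_mlp = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        self.node_mlp = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )

    def forward(self, x, edge_index, num_sampled_edges):
        # x: (N, hidden_dim) node features; edge_index: (2, E) as (src, dst).
        num_edges = edge_index.shape[1]
        keep = torch.randperm(num_edges)[: min(num_sampled_edges, num_edges)]
        src, dst = edge_index[0, keep], edge_index[1, keep]
        # Messages are computed only on the sampled edges, so the cost of a
        # step scales with the sample size rather than the full edge count.
        messages = self.edge_mlp(torch.cat([x[src], x[dst]], dim=-1))
        aggregated = torch.zeros_like(x)
        aggregated.index_add_(0, dst, messages)  # sum incoming messages
        return x + self.node_mlp(torch.cat([x, aggregated], dim=-1))


if __name__ == "__main__":
    layer = SampledMPLayer(hidden_dim=32)
    x = torch.randn(1000, 32)                       # 1000 mesh nodes
    edge_index = torch.randint(0, 1000, (2, 8000))  # 8000 directed edges
    out = layer(x, edge_index, num_sampled_edges=2000)
    print(out.shape)  # torch.Size([1000, 32])
```

Resampling edges each forward pass keeps memory and compute bounded on large meshes, which is what lets the sampled variant reach node counts a dense single-GPU MPNN cannot.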
