Can Physics Informed Neural Operators Self Improve? (2311.13885v1)

Published 23 Nov 2023 in cs.LG, cs.AI, and math.AP

Abstract: Self-training techniques have shown remarkable value across many deep learning models and tasks. However, such techniques remain largely unexplored in the context of learning fast solvers for systems of partial differential equations (e.g., neural operators). In this work, we explore the use of self-training for Fourier Neural Operators (FNO). Neural operators emerged as a data-driven technique; however, data from experiments or traditional solvers is not always readily available. Physics Informed Neural Operators (PINO) overcome this constraint by using a physics loss for training, but the accuracy of a PINO trained without data does not match the performance obtained by training with data. In this work we show that self-training can be used to close this gap. We examine canonical examples, namely the 1D Burgers and 2D Darcy PDEs, to showcase the efficacy of self-training. Specifically, FNOs trained exclusively with physics loss through self-training approach the accuracy of FNOs trained with both data and physics loss, to within 1.07x for Burgers and 1.02x for Darcy. Furthermore, we find that pseudo-labels can be used for self-training without training to convergence in each iteration. As a consequence, we are able to discover self-training schedules that improve on the baseline PINO in both accuracy and training time.
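The loop the abstract describes is straightforward to sketch: pretrain an operator on the physics loss alone, take the model's own predictions as pseudo-labels, then alternate short, deliberately non-converged training phases that combine the physics loss with a data loss on those pseudo-labels. Below is a minimal, self-contained PyTorch sketch of that loop under stated assumptions: the small MLP stands in for an FNO, the finite-difference Burgers residual stands in for the paper's physics loss, and all constants (grid size, viscosity, step counts, learning rate) are illustrative choices, not the authors' settings.

```python
# Hypothetical sketch of physics-loss self-training for a 1D Burgers solver.
# The MLP, residual, and schedule are stand-ins, not the paper's method.
import torch
import torch.nn as nn

N = 64                                   # spatial grid points (periodic)
NU = 0.01                                # viscosity in u_t + u*u_x = NU*u_xx
DT = 0.01                                # time step the operator advances
x = torch.linspace(0.0, 1.0, N + 1)[:-1]  # periodic grid on [0, 1)

def sample_ics(batch):
    # Random smooth periodic initial conditions (hypothetical data source).
    k = torch.arange(1, 4).view(1, -1, 1)
    a = torch.randn(batch, 3, 1)
    return (a * torch.sin(2 * torch.pi * k * x.view(1, 1, -1))).sum(dim=1)

def ddx(u):
    # Central finite difference on the periodic grid (dx = 1/N).
    return (u.roll(-1, dims=-1) - u.roll(1, dims=-1)) * (N / 2.0)

def burgers_residual(u0, u1):
    # Residual of u_t + u*u_x - NU*u_xx = 0 for one explicit step of size DT.
    u_t = (u1 - u0) / DT
    return u_t + u1 * ddx(u1) - NU * ddx(ddx(u1))

model = nn.Sequential(nn.Linear(N, 256), nn.GELU(), nn.Linear(256, N))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

def train_phase(steps, u0, pseudo=None):
    # One short (deliberately non-converged) phase: physics loss, plus a
    # data loss against pseudo-labels whenever they are provided.
    for _ in range(steps):
        u1 = model(u0)
        loss = burgers_residual(u0, u1).pow(2).mean()
        if pseudo is not None:
            loss = loss + (u1 - pseudo).pow(2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return loss.item()

u0 = sample_ics(32)                 # one fixed batch of inputs, for brevity
train_phase(200, u0)                # phase 0: physics loss only (PINO-style)
for it in range(5):                 # self-training iterations
    with torch.no_grad():
        pseudo = model(u0)          # current predictions become pseudo-labels
    loss = train_phase(100, u0, pseudo)
    print(f"self-training iter {it}: loss = {loss:.4e}")
```

The schedule is the design point the abstract highlights: because each inner phase stops well short of convergence, pseudo-labels are refreshed cheaply and often, which is how the abstract suggests self-training can improve on the PINO baseline in both accuracy and training time.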

