
Training Physics-Informed Neural Networks via Multi-Task Optimization for Traffic Density Prediction (2307.03920v1)

Published 8 Jul 2023 in cs.NE and cs.LG

Abstract: Physics-informed neural networks (PINNs) are a newly emerging research frontier in machine learning, which incorporate certain physical laws that govern a given data set, e.g., those described by partial differential equations (PDEs), into the training of the neural network (NN) based on such a data set. In PINNs, the NN acts as the solution approximator for the PDE while the PDE acts as the prior knowledge to guide the NN training, leading to the desired generalization performance of the NN when facing the limited availability of training data. However, training PINNs is a non-trivial task largely due to the complexity of the loss composed of both NN and physical law parts. In this work, we propose a new PINN training framework based on the multi-task optimization (MTO) paradigm. Under this framework, multiple auxiliary tasks are created and solved together with the given (main) task, where the useful knowledge from solving one task is transferred in an adaptive mode to assist in solving some other tasks, aiming to uplift the performance of solving the main task. We implement the proposed framework and apply it to train the PINN for addressing the traffic density prediction problem. Experimental results demonstrate that our proposed training framework leads to significant performance improvement in comparison to the traditional way of training the PINN.
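To make the composite loss mentioned in the abstract concrete, below is a minimal sketch of a standard PINN loss for traffic density, not the authors' code. The network `DensityNet`, the function `pinn_loss`, and the choice of the LWR conservation law with a Greenshields flux are illustrative assumptions; the paper's exact PDE, architecture, and, in particular, its multi-task weighting scheme may differ (a plain sum of the two terms is used here).

```python
# Hedged sketch of a PINN loss for traffic density prediction.
# Assumptions (not from the paper): LWR PDE rho_t + d/dx f(rho) = 0 with
# Greenshields flux f(rho) = v_max * rho * (1 - rho / rho_max).
import torch
import torch.nn as nn

class DensityNet(nn.Module):
    """Small MLP approximating density rho(x, t)."""
    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=1))

def pinn_loss(model, x_d, t_d, rho_d, x_c, t_c, v_max=1.0, rho_max=1.0):
    # Data term: fit observed densities rho_d at measurement points (x_d, t_d).
    loss_data = ((model(x_d, t_d) - rho_d) ** 2).mean()

    # Physics term: PDE residual at collocation points (x_c, t_c),
    # computed with automatic differentiation.
    x_c = x_c.clone().requires_grad_(True)
    t_c = t_c.clone().requires_grad_(True)
    rho = model(x_c, t_c)
    flux = v_max * rho * (1.0 - rho / rho_max)
    rho_t = torch.autograd.grad(rho.sum(), t_c, create_graph=True)[0]
    flux_x = torch.autograd.grad(flux.sum(), x_c, create_graph=True)[0]
    loss_pde = ((rho_t + flux_x) ** 2).mean()

    # The paper's contribution lies in how the data and physics objectives
    # are handled via multi-task optimization; here they are simply summed.
    return loss_data + loss_pde
```

The difficulty the paper targets is visible in the last line: the data-fit and PDE-residual terms can differ in scale and conflict during training, which is why the authors cast PINN training as a multi-task optimization problem with auxiliary tasks rather than relying on a fixed weighted sum.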

Authors (6)
  1. Bo Wang (823 papers)
  2. A. K. Qin (37 papers)
  3. Sajjad Shafiei (6 papers)
  4. Hussein Dia (4 papers)
  5. Adriana-Simona Mihaita (19 papers)
  6. Hanna Grzybowska (5 papers)
Citations (2)
