BubbleONet: Neural Operator for Bubble Dynamics
- BubbleONet is a physics-informed neural operator that maps pressure profiles to bubble radius responses with high accuracy.
- It fuses a branch–trunk network design with physics-based ODE residuals to keep solutions consistent with the governing bubble dynamics equations.
- The framework offers computational efficiency and robust performance, with inference orders of magnitude faster than traditional ODE solvers for high-frequency bubble dynamics.
BubbleONet is a physics-informed neural operator designed for mapping pressure profiles to bubble radius responses in high-frequency bubble dynamics. Building on the PI-DeepONet (Physics-Informed Deep Operator Network) framework, BubbleONet fuses universal operator learning capabilities with physics-based constraints, enabling accurate, physically consistent, and computationally efficient surrogate modeling of complex bubble oscillations governed by nonlinear ordinary differential equations (ODEs) such as the Rayleigh–Plesset and Keller–Miksis equations.
1. Architecture and Design Principles
BubbleONet comprises two principal components in alignment with DeepONet’s operator learning paradigm:
- Branch Network: Receives the full pressure profile as input. This component is implemented using a Kronecker Neural Network (KNN) structure augmented by the Rowdy adaptive activation function, addressing the spectral bias evident in deep networks by incorporating trainable sinusoidal elements (amplitude, frequency, phase) alongside standard ReLU activations.
- Trunk Network: Encodes low-dimensional features such as time and initial bubble radius into a latent representation via a standard feedforward neural network.
Prediction of the bubble radius is realized through an inner product of the branch and trunk latent representations,

$$R(t, R_0) \approx \sum_{k=1}^{q} b_k(p)\,\tau_k(t, R_0),$$

where $b_k$ denotes the branch output for the pressure profile $p$, $\tau_k$ the trunk output for the coordinates $(t, R_0)$, and $q$ the latent dimension. The result is then passed through a SmoothReLU activation to enforce the physical positivity of the predicted bubble radius.
The model integrates physics constraints by adding an ODE-residual loss corresponding to the governing bubble dynamics equations, ensuring solution consistency with established physical laws.
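A minimal sketch of this branch–trunk composition, assuming a PyTorch implementation, is given below; the layer sizes, the `Softplus` stand-in for SmoothReLU, and the class name are illustrative rather than taken from the original code, and the plain ReLU branch layers stand in for the Kronecker/Rowdy construction described in the next section.

```python
import torch
import torch.nn as nn

class BubbleONetSketch(nn.Module):
    """Illustrative DeepONet-style branch-trunk model (sizes are placeholders)."""

    def __init__(self, n_sensors=128, latent_dim=64):
        super().__init__()
        # Branch: encodes the discretized pressure profile p(t) sampled at n_sensors points.
        self.branch = nn.Sequential(
            nn.Linear(n_sensors, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),
        )
        # Trunk: encodes the low-dimensional query (time t, initial radius R0).
        self.trunk = nn.Sequential(
            nn.Linear(2, 128), nn.Tanh(),
            nn.Linear(128, latent_dim),
        )
        # Softplus used here as a smooth, positivity-preserving stand-in for SmoothReLU.
        self.out_act = nn.Softplus()

    def forward(self, pressure, query):
        b = self.branch(pressure)          # (batch, latent_dim)
        tau = self.trunk(query)            # (batch, latent_dim)
        radius = (b * tau).sum(dim=-1)     # inner product over the latent basis
        return self.out_act(radius)        # enforce R > 0
```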
2. Mathematical Foundations
The Rowdy activation function is critical in counteracting spectral bias, enabling the network to learn the high-frequency response components essential in bubble dynamics driven by ultrasound and shock waves. The function adopts the structure

$$\phi(x) = \mathrm{ReLU}(x) + n\,a\,\sin(\omega x + \varphi),$$

where $n$ is a scaling factor (set to 10), $a$ initializes to 0.1, and $a$, $\omega$, $\varphi$ are trainable.
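A sketch of an adaptive activation with this structure, in PyTorch, might look as follows; the single sinusoidal term per layer and the class name are illustrative assumptions consistent with the description above.

```python
import torch
import torch.nn as nn

class RowdyActivation(nn.Module):
    """ReLU plus a trainable sinusoidal perturbation (illustrative sketch)."""

    def __init__(self, n_scale=10.0):
        super().__init__()
        self.n = n_scale                                  # fixed scaling factor (10)
        self.a = nn.Parameter(torch.tensor(0.1))          # trainable amplitude, init 0.1
        self.omega = nn.Parameter(torch.tensor(1.0))      # trainable frequency
        self.phi = nn.Parameter(torch.tensor(0.0))        # trainable phase

    def forward(self, x):
        return torch.relu(x) + self.n * self.a * torch.sin(self.omega * x + self.phi)
```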
The total training loss combines a standard mean squared error on the data ($\mathcal{L}_{\text{data}}$) with an ODE-residual loss ($\mathcal{L}_{\text{ODE}}$), derived by discretizing the governing ODE with a Runge–Kutta method. For example,

$$\mathcal{L}_{\text{ODE}} = \frac{1}{N}\sum_{n=1}^{N}\left\|\,\mathbf{y}_{n+1} - \mathbf{y}_{n} - \Delta t \sum_{i} b_i\,\mathbf{k}_i \right\|^2,$$

where $\mathbf{y}$ contains the state $(R, \dot R)$ and $\mathbf{k}_i$ are the stage derivatives of the Runge–Kutta scheme with weights $b_i$. Network parameters $\theta$ are updated to minimize

$$\mathcal{L}(\theta) = \mathcal{L}_{\text{data}} + \mathcal{L}_{\text{ODE}},$$

thereby penalizing deviations from both the empirical data and the physical ODE constraints.
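The sketch below illustrates how such a combined loss could be assembled for a classical fourth-order Runge–Kutta discretization; the function names, the generic `rhs` callable for the bubble ODE written as a first-order system, and the equal weighting of the two terms are assumptions made for illustration.

```python
import torch

def rk4_residual(y, t, dt, rhs):
    """Residual of one explicit RK4 step for a first-order system y' = rhs(t, y).

    y: (N+1, 2) predicted states [R, dR/dt] at times t_0..t_N (from the network).
    Returns the mean squared defect of each predicted step against the RK4 update.
    """
    y_n, y_np1 = y[:-1], y[1:]
    t_n = t[:-1]
    k1 = rhs(t_n, y_n)
    k2 = rhs(t_n + 0.5 * dt, y_n + 0.5 * dt * k1)
    k3 = rhs(t_n + 0.5 * dt, y_n + 0.5 * dt * k2)
    k4 = rhs(t_n + dt, y_n + dt * k3)
    defect = y_np1 - y_n - (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return (defect ** 2).mean()

def total_loss(pred_y, true_y, t, dt, rhs):
    data_loss = ((pred_y - true_y) ** 2).mean()     # L_data: MSE against reference data
    ode_loss = rk4_residual(pred_y, t, dt, rhs)     # L_ODE: physics residual
    return data_loss + ode_loss                     # equal weighting assumed here
```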
3. Evaluation Regimes
Three canonical bubble dynamics problems were used to assess BubbleONet’s accuracy and generalization:
- Rayleigh–Plesset (R–P) Equation: A prototype model for bubble oscillations in incompressible liquids. BubbleONet was evaluated for a fixed initial radius $R_0$ and a variety of driving pressure profiles (varying frequency and amplitude). Both time-domain and Fourier-spectrum analyses confirmed accurate reproduction of the natural and driving frequencies.
- Keller–Miksis (K–M) Equation: A refinement of R–P, modeling compressible effects critical at high frequencies and amplitudes. Performance was tested on cases with single and multiple initial bubble radii.
- Multiradius Scenarios: The network’s ability to generalize across a span of initial bubble radii $R_0$ was probed, confirming robust fidelity in both interpolation and extrapolation over this parameter.
BubbleONet demonstrated capability to capture both low- and high-frequency spectral features, critical for simulating realistic bubble oscillations in biomedical and engineering contexts.
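As a concrete reference for the Rayleigh–Plesset case above, a minimal right-hand-side function for the equation in first-order form (compatible with the residual sketch in the previous section) might look as follows; the water-like parameter defaults are illustrative and not the settings of the original study.

```python
import torch

def rayleigh_plesset_rhs(t, y, p_inf, R0=1e-6, rho=998.0, mu=1e-3,
                         sigma=0.0725, p0=101325.0, p_v=2330.0, kappa=1.4):
    """Rayleigh-Plesset equation in first-order form, y = [..., (R, dR/dt)].

    p_inf: callable returning the far-field driving pressure at time t.
    Parameter values are generic water-like defaults, not the study's settings.
    """
    R, Rdot = y[..., 0], y[..., 1]
    p_g0 = p0 + 2.0 * sigma / R0 - p_v                  # initial gas pressure in the bubble
    p_bubble = p_g0 * (R0 / R) ** (3.0 * kappa) + p_v   # pressure at the bubble wall
    accel = ((p_bubble - p_inf(t) - 2.0 * sigma / R - 4.0 * mu * Rdot / R) / rho
             - 1.5 * Rdot ** 2) / R
    return torch.stack([Rdot, accel], dim=-1)
```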
4. Training Strategies and Optimization
Two principal training methodologies were examined:
- Single-Step Training: Joint optimization over both branch and trunk networks. The complete loss (data + ODE residual) is minimized in a single phase. While direct, this approach can encounter challenges with convergence, especially for high-frequency content representation.
- Two-Step Training: A staged approach wherein the trunk network is pre-trained, its output basis extracted and orthonormalized (using, for instance, singular value decomposition, SVD). The branch network is then trained with fixed trunk outputs. This strategy led to approximately 20% faster convergence and greater robustness, particularly in extrapolation scenarios.
Both strategies facilitate the practical deployment of BubbleONet as a surrogate model, with the two-step variant exhibiting improved optimization landscape characteristics.
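A minimal sketch of the trunk-basis extraction step in the two-step strategy, assuming the trunk network has already been pre-trained, is shown below; the use of `torch.linalg.svd` on a matrix of trunk outputs and the helper name are illustrative choices.

```python
import torch

@torch.no_grad()
def orthonormal_trunk_basis(trunk, queries):
    """Evaluate a pre-trained trunk on query points and orthonormalize its outputs.

    queries: (M, 2) tensor of (time, initial radius) pairs.
    Returns an (M, r) matrix whose columns form an orthonormal basis (via SVD),
    which the branch network is then trained against with the trunk held fixed.
    """
    T = trunk(queries)                       # (M, latent_dim) raw trunk features
    U, S, Vh = torch.linalg.svd(T, full_matrices=False)
    r = int((S > 1e-10 * S[0]).sum())        # numerical rank of the feature matrix
    return U[:, :r]                          # orthonormal columns spanning the trunk range
```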
5. Performance Analysis and Practical Implications
BubbleONet achieves high accuracy across the tested regimes. For R–P dynamics, the model resolves both the fundamental and subharmonic frequencies. In high-frequency K–M cases (driving pressures oscillating at 1.9 MHz), BubbleONet required extended training (up to 500,000 epochs) to capture the rapid oscillatory features precisely; error decreased progressively with training duration.
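A brief sketch of the kind of Fourier-spectrum check described above, assuming the predicted radius trace is available as a NumPy array; the helper name and the use of the real FFT are illustrative.

```python
import numpy as np

def radius_spectrum(r_pred, dt):
    """Amplitude spectrum of a predicted radius trace R(t) sampled every dt seconds.

    Returns (frequencies in Hz, spectral amplitudes); peaks can be compared against
    the driving frequency and the bubble's natural and subharmonic frequencies.
    """
    r = r_pred - r_pred.mean()                 # remove the DC component
    amp = np.abs(np.fft.rfft(r)) / len(r)
    freqs = np.fft.rfftfreq(len(r), d=dt)
    return freqs, amp
```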
Maximum absolute errors for the multiradius cases remain small, underscoring predictive stability. BubbleONet’s predictions are more sensitive to variation in driving frequency than to variation in amplitude, and periodic error structures were observed, plausibly originating from the sinusoidal components of the Rowdy activation.
The computational efficiency is substantial: inference via BubbleONet proceeds orders of magnitude faster than the classical ODE solvers required for the stiff equations arising in cavitation modeling, positioning it for real-time simulation, parameter-sweep analyses, and large-scale scenario design.
Applications suggested by these findings include real-time planning in ultrasound-mediated therapy, rapid assessment of cavitation erosion in engineering systems, and data-sparse predictive modeling in physical regimes governed by complex bubble oscillations.
6. Limitations and Future Directions
BubbleONet’s error tends to increase when extrapolating beyond the training domain in driving frequency and time. A plausible implication is that periodicities inherent in the Rowdy activation induce regular error patterns, particularly when the physical regime diverges markedly from the training data.
Potential future research areas include optimizing activation functions for further spectral bias reduction, integrating additional physical parameters into the input space (temperature, viscosity, dissolved gas concentration), and extending the surrogate to coupled bubble populations or multidomain fluid–structure interaction models.
Enhanced architectures that incorporate long-term temporal dependencies or ensemble uncertainty quantification could further broaden BubbleONet’s applicability as a predictive and design tool in scientific and industrial bubble dynamics simulations.