
Bayesian Learning in a Nonlinear Multiscale State-Space Model (2408.06425v6)

Published 12 Aug 2024 in eess.SP, cs.LG, and stat.ML

Abstract: The ubiquity of multiscale interactions in complex systems is well-recognized, with development and heredity serving as a prime example of how processes at different temporal scales influence one another. This work introduces a novel multiscale state-space model to explore the dynamic interplay between systems interacting across different time scales, with feedback between each scale. We propose a Bayesian learning framework to estimate unknown states by learning the unknown process noise covariances within this multiscale model. We develop a Particle Gibbs with Ancestor Sampling (PGAS) algorithm for inference and demonstrate through simulations the efficacy of our approach.

Citations (1)

Summary

  • The paper introduces a Bayesian framework leveraging PGAS for effective state and parameter estimation in a multiscale nonlinear state-space model.
  • It integrates conjugate priors and inverse-Wishart distributions to robustly estimate fine and coarse scale process noise covariances.
  • Simulation results demonstrate low RMSE and accurate latent trajectory recovery, underscoring its potential in biological and complex systems modeling.

Bayesian Learning in a Multiscale Nonlinear State-Space Model

Introduction

The paper "Bayesian Learning in a Multiscale Nonlinear State-Space Model," authored by Nayely Velez-Cruz and Manfred D. Laubichler from the School of Complex Systems at Arizona State University, introduces an innovative approach to modeling dynamic interactions between developmental and hereditary processes across different temporal scales using Bayesian learning. The researchers specifically aim to estimate unknown states and process noise covariances within a multiscale model by employing a Particle Gibbs with Ancestor Sampling (PGAS) algorithm.

Multiscale State-Space Model

The model proposed in this work is structured to handle two distinct time scales: a fine scale representing individual developmental stages and a coarse scale representing hereditary traits across generations. The latent fine-scale states $\mathbf{x}^t_{d,k}$ and coarse-scale states $\tilde{\mathbf{X}}_{d,t}$ follow state-transition and measurement equations with additive Gaussian noise. Fine-scale process noise is denoted $\mathbf{w}^t_{d,k} \sim \mathcal{N}(\mathbf{0}, \Sigma_f)$, while coarse-scale process noise is $\mathbf{W}_{d,t} \sim \mathcal{N}(\mathbf{0}, \Sigma_{c,d})$.
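The paper does not spell out its transition functions in this summary, so the following is a minimal sketch of how such a two-scale generative model can be simulated for a single subject. The nonlinearities `f` (fine scale, with coarse-scale feedback) and `g` (coarse scale, driven by the final fine-scale state of the generation) are hypothetical stand-ins; only $\Sigma_f = 0.2\,\mathbf{I}$ matches a value stated in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
Nx, T, K = 2, 5, 20                  # state dim, generations (coarse), fine steps per generation
Sigma_f = 0.2 * np.eye(Nx)           # fine-scale process noise covariance (paper's value)
Sigma_c = 0.1 * np.eye(Nx)           # coarse-scale process noise covariance (illustrative)
R = 0.05 * np.eye(Nx)                # measurement noise covariance (illustrative)

def f(x, X_coarse):
    # hypothetical fine-scale transition with feedback from the coarse scale
    return np.tanh(x) + 0.1 * X_coarse

def g(X_coarse, x_last):
    # hypothetical coarse-scale transition driven by the generation's final fine state
    return 0.9 * X_coarse + 0.2 * x_last

X = np.zeros(Nx)                     # coarse-scale state
ys = []                              # fine-scale measurements
for t in range(T):
    x = np.zeros(Nx)                 # fine-scale state, reset each generation
    for k in range(K):
        x = f(x, X) + rng.multivariate_normal(np.zeros(Nx), Sigma_f)
        ys.append(x + rng.multivariate_normal(np.zeros(Nx), R))
    X = g(X, x) + rng.multivariate_normal(np.zeros(Nx), Sigma_c)
```

The nested loop makes the two time scales explicit: `K` fine-scale steps run inside each of the `T` coarse-scale generations, with feedback in both directions.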

Bayesian Hierarchical Model and Inference

The Bayesian hierarchical model is designed with conjugate priors, assuming inverse-Wishart distributions for the process noise covariances, ensuring the feasibility of closed-form posterior computations. The inference process hinges on the PGAS algorithm, blending particle filtering with ancestor sampling and Gibbs sampling to achieve effective state and parameter estimation.
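To make the conjugacy concrete, here is a minimal sketch of the inverse-Wishart update for one of the noise covariances, assuming the corresponding state trajectory (and hence the transition residuals) has already been sampled in the current Gibbs iteration. The hyperparameter names `Psi0`, `nu0` and the helper function are illustrative, not from the paper.

```python
import numpy as np
from scipy.stats import invwishart

def sample_cov_posterior(residuals, Psi0, nu0, rng=None):
    """Conjugate inverse-Wishart draw for a process noise covariance.

    residuals: (K, Nx) array of state-transition residuals x_k - f(x_{k-1}).
    Prior Sigma ~ IW(nu0, Psi0) gives posterior IW(nu0 + K, Psi0 + S),
    where S is the sum of outer products of the residuals.
    """
    K, _ = residuals.shape
    S = residuals.T @ residuals          # sum of residual outer products
    return invwishart.rvs(df=nu0 + K, scale=Psi0 + S, random_state=rng)
```

Because the update only adds the residual scatter matrix to the prior scale and the sample count to the degrees of freedom, each covariance draw inside the Gibbs sweep is a single closed-form sample.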

Particle Gibbs with Ancestor Sampling (PGAS)

The PGAS algorithm is pivotal for state and parameter estimation, iteratively sampling states and parameters. The algorithm operates by:

  1. Sampling state trajectories for the fine time scale given measurements and coarse-scale states.
  2. Sampling coarse-scale trajectories given measurements and fine-scale trajectories.
  3. Sampling fine-scale process covariance given fine-scale states and measurements.
  4. Sampling coarse-scale process covariance given coarse-scale states and measurements.

Simulation Settings

The simulations carried out employ predefined parameters for fine and coarse scales. Specific settings include:

  • Fine-scale process noise covariance $\Sigma_f^{\text{true}} = 0.2 \times \mathbf{I}_{N_x}$
  • Coarse-scale process noise covariances defined individually for each subject $d$
  • Inverse-Wishart priors on the process noise covariances

The PGAS algorithm utilizes 500 particles and runs for 10,000 iterations, discarding the initial 10% as burn-in.
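Those settings translate directly into sampler bookkeeping. A minimal sketch, using a stand-in chain of scalar draws in place of actual PGAS output:

```python
import numpy as np

N_PARTICLES = 500          # particles per conditional SMC sweep (paper's setting)
N_ITERS = 10_000           # Gibbs iterations (paper's setting)
BURN_IN = N_ITERS // 10    # first 10% of iterations discarded as burn-in

# Stand-in chain of scalar covariance samples; in practice each entry
# would come from one PGAS/Gibbs iteration.
rng = np.random.default_rng(1)
chain = 0.2 + 0.01 * rng.standard_normal(N_ITERS)

kept = chain[BURN_IN:]                 # retain the last 9,000 samples
posterior_mean = kept.mean()           # posterior summary from the kept draws
```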

Results and Discussion

The algorithm's performance was validated through simulations, revealing that PGAS effectively estimates both fine and coarse scale trajectories and accurately learns the process noise covariances. The results underscore a low RMSE across most individuals and dimensions, indicating the model's robustness. The simulation results (illustrated in figures and tables) exhibit strong congruence between true and estimated states both at fine and coarse scales, demonstrating the practical viability of the proposed multiscale approach.

Implications and Future Work

The presented work offers valuable insights into the dynamics of developmental and hereditary processes, potentially benefiting various biological applications such as gene regulatory network inference and ecological modeling. Future developments could involve refining the PGAS algorithm, optimizing model parameters, and extending the application to other multiscale complex systems. Exploring alternative priors and improving computational efficiency could also enhance the model's applicability in real-world scenarios.

In conclusion, this paper represents a significant contribution to multiscale modeling by effectively integrating Bayesian learning with a nonlinear state-space framework, thereby opening new avenues for understanding and predicting complex evolutionary and developmental dynamics.