
Space Weather Prediction Testbed

Updated 15 November 2025
  • Space Weather Prediction Testbed (SWPT) is an operational exercise that evaluates real-time SEP forecasting via advanced physics-based models.
  • The methodology integrates adaptive grid strategies and dual-run protocols to balance rapid predictions with high-fidelity outputs critical for human spaceflight.
  • Operational implications include interagency collaborations and workflow integration with agencies like NOAA and NASA, ensuring timely support for missions such as Artemis II.

The Space Weather Prediction Testbed (SWPT) is a comprehensive operational exercise designed to critically evaluate the real-time forecasting capabilities of advanced physics-based space weather models in a near-operational environment. In May 2025, the SWPT focused on the SOlar wind with FIeld lines and Energetic particles (SOFIE) model developed by the CLEAR Space Weather Center of Excellence, testing its ability to predict solar energetic particle (SEP) fluxes in support of human spaceflight, particularly for Artemis II mission scenarios. The exercise involved live, on-site interagency collaboration at the National Oceanic and Atmospheric Administration’s Space Weather Prediction Center (NOAA/SWPC), with emphasis on computational performance, model accuracy, workflow integration, and forecasting utility in simulated real-time conditions (Liu et al., 12 Nov 2025).

1. Objectives and Organizational Structure

The 2025 SWPT exercise was conducted at NOAA/SWPC (Boulder, CO) with coordination from multiple agencies and research centers:

  • Organizers: NOAA/SWPC, NASA’s Community Coordinated Modeling Center (CCMC), Space Radiation Analysis Group (SRAG), Moon-to-Mars Space Weather Analysis Office (M2M SWAO), and the CLEAR Center.
  • Participants: SWPC forecasters (flare detection, CME analysis), M2M SWAO analysts (CME parameter extraction), CCMC (model support), SRAG (dose response), CLEAR team (SOFIE operations), and external observers.

Key objectives were:

  • Assess SOFIE’s ability to deliver real-time or faster predictions of SEP fluxes relevant to astronaut safety in Orion-class vehicles.
  • Evaluate both technical metrics (latency, computational efficiency, predictive accuracy) and operational integration aspects (fit within forecaster and console-operator workflows, analyst coordination).
  • Test response using two historical, well-characterized SEP events:

    1. 10 September 2017, X8.2 flare and fast CME (plane-of-sky speed ∼2650 km/s).
    2. 4 November 2001, X1.0 flare and fast-halo CME (∼1925 km/s).
  • Practice simulated astronaut-dose computation and response protocols for human spaceflight support.

2. SOFIE Model Architecture and Governing Physics

SOFIE is implemented within the Space Weather Modeling Framework (SWMF) with three primary modules:

  • AWSoM-R: Stream-aligned magnetohydrodynamics (MHD), providing dynamic ambient solar wind solutions.
  • EEGGL: Gibson–Low flux-rope generator for CME introduction.
  • M-FLAMPA: Multiple Field-Line Advection Model for Particle Acceleration, solving the focused transport equation for SEP injection and propagation.

2.1 MHD Physics (AWSoM-R)

  • Mass continuity:

\frac{\partial \rho}{\partial t} + \nabla\cdot(\rho\,\mathbf{U}) = 0

  • Momentum equation (includes Lorentz force, gravity):

\frac{\partial (\rho \mathbf{U})}{\partial t} + \nabla\cdot[\rho\,\mathbf{U}\mathbf{U} + p\,\mathbf{I}] = \mathbf{J}\times\mathbf{B} + \rho\,\mathbf{g}

  • Energy equation incorporates Alfvén-wave heating and thermal conduction.

2.2 SEP Transport and Acceleration (M-FLAMPA)

  • Focused transport equation on magnetic field lines:

\frac{\partial f}{\partial t} + \mu v\,\frac{\partial f}{\partial s} + \frac{1-\mu^2}{2L(s)}\,v\,\frac{\partial f}{\partial \mu} = \frac{\partial}{\partial \mu}\left[D_{\mu\mu}(s,p)\frac{\partial f}{\partial \mu}\right] + Q_\mathrm{inj}(s,p,t)

where s is the field-line distance, μ the particle pitch-angle cosine, L(s) the focusing length, and Q_inj(s,p,t) the shock-driven injection source.

  • Injection at CME shock parameterized as:

Q_\mathrm{inj}(s,p,t) = S_0\,\delta[s-s_\mathrm{sh}(t)]\,\left(\frac{p}{p_0}\right)^{-q}

with S_0 a scaling factor, q the spectral index, and p_0 a reference momentum.
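As an illustration of the momentum dependence of this source term, the power law can be evaluated directly; the parameter values below are illustrative defaults, not the operational SOFIE settings.

```python
# Power-law momentum dependence of the shock injection term,
# Q_inj ∝ S_0 * (p / p0)**(-q). Parameter values are illustrative only.

def injection_spectrum(p, s0=1.0, p0=1.0, q=4.0):
    """Injection strength at momentum p for spectral index q."""
    return s0 * (p / p0) ** (-q)

# A harder spectrum (smaller q) injects relatively more high-momentum particles:
hard = injection_spectrum(10.0, q=4.0)
soft = injection_spectrum(10.0, q=5.0)
print(hard / soft)  # ≈ 10: one power of (p/p0) difference
```

The delta function in s is handled by the transport solver (injection only at the moving shock front), so only the momentum dependence is sketched here.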

3. Simulation Domain and Grid Optimization

The spatial domain uses a two-part approach:

  • Spherical shell (SC): 1.1–24 R⊙
  • Cartesian heliosphere (IH): 20–650 R⊙ (with overlap buffer)

To manage computational cost without degrading predictive performance, the grid strategy employed a coarse background mesh with block-adaptive mesh refinement (AMR) concentrated on:

  • The heliospheric current sheet (HCS)
  • Earth-directed cones (±15° of the Sun–Earth line)
  • CME propagation/shock regions

Three domain setups were evaluated:

    1. Setup 1: Fine default grid (angular Δφ = 0.7°–2.8°) with AMR.
    2. Setup 2: Background grid coarsened by 2× (1.4°–5.6° angular cells), retaining AMR.
    3. Setup 3: Same coarse background as Setup 2, but with default (fine) resolution in AMR-refined regions.

The corresponding grid cost scaling leverages the AMR approach:

N_{\rm eff} \approx \left(\frac{L}{\Delta x_{\rm bg}}\right)^3 + \alpha\left(\frac{L_{\rm ref}}{\Delta x_{\rm ref}}\right)^3

where Δx_bg is the background spacing, Δx_ref the fine spacing, and α the refined volume fraction. Doubling Δx_bg (i.e., coarsening the background by 2×) reduces the background cell count by ∼8× for the SC.
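A quick back-of-the-envelope check of this scaling; the domain sizes and refined fraction below are purely illustrative, not the actual SOFIE grid parameters.

```python
# Effective cell count N_eff ≈ (L/Δx_bg)^3 + α (L_ref/Δx_ref)^3 for a
# block-AMR grid. All lengths and fractions here are illustrative only.

def n_eff(L, dx_bg, L_ref, dx_ref, alpha):
    background = (L / dx_bg) ** 3             # coarse background cells
    refined = alpha * (L_ref / dx_ref) ** 3   # AMR-refined cells
    return background + refined

fine   = n_eff(L=48.0, dx_bg=0.5, L_ref=12.0, dx_ref=0.25, alpha=0.05)
coarse = n_eff(L=48.0, dx_bg=1.0, L_ref=12.0, dx_ref=0.25, alpha=0.05)

# Doubling Δx_bg cuts the background term by 8x, while the refined term,
# which carries the forecast-critical resolution, is unchanged.
print(fine / coarse)
```

This is why Setup 2 and Setup 3 coarsen only the background: the cubic cost of the background mesh dominates, while AMR keeps resolution where the physics demands it.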

4. Technical Performance and Accuracy Results

SOFIE simulations were run on the NASA Pleiades supercomputer with 1,000 CPU cores (25 Cascade Lake nodes).

4.1 Wall-Clock vs. Simulated Time

  • Setup 2 completed 4-day SEP predictions in 4.86 hours; it achieved real-time catchup within 1.19 hours after CME input (latency ≈0.19 hr).
  • Setup 3 required 18.6 hours to complete the run (catchup in 4.11 hr).
  • Setup 1 (highest resolution) took 21.1 hours (catchup in 10.87 hr).
  • All configurations completed the 4-day window faster than real time, but only Setup 2's margin was large enough for rapid operational forecasting.
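These runtimes can be expressed as speed-ups relative to the 96-hour simulated window, using the wall-clock totals reported above.

```python
# Speed-up of each setup relative to the 4-day (96-hour) simulated window,
# using the total wall-clock runtimes reported for the exercise.

SIMULATED_HOURS = 96.0
runtimes_hr = {"Setup 1": 21.1, "Setup 2": 4.86, "Setup 3": 18.6}

speedups = {name: SIMULATED_HOURS / wall for name, wall in runtimes_hr.items()}
for name, s in speedups.items():
    print(f"{name}: {s:.1f}x faster than real time")
```

Setup 2's roughly 20× margin over real time is what makes the ≈1-hour catchup after CME input achievable in an operational setting.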

4.2 Predictive Accuracy

Accuracy metrics (4 November 2001 event):

  Energy channel | Setup | Spearman ρ | % within 10× of obs. | Median log-error | Total runtime (hr)
  -------------- | ----- | ---------- | -------------------- | ---------------- | ------------------
  >10 MeV        |   1   | 0.93       | 99.9%                | −0.18            | 21.1
  >10 MeV        |   2   | 0.84       | 92.7%                | +0.05            | 4.86
  >10 MeV        |   3   | 0.87       | 84.7%                | −0.04            | 18.6
  >100 MeV       |   1   | 0.92       | 99.6%                | +0.34            | 21.1
  >100 MeV       |   2   | 0.56       | 92.1%                | +0.13            | 4.86
  >100 MeV       |   3   | 0.57       | 97.8%                | +0.21            | 18.6

Setup 1 maximizes correlation and minimizes bias, but with greater latency. Setup 2 trades a modest loss in accuracy (order-of-magnitude agreement >92%) for ∼4× faster wall-clock performance, meeting operational needs for rapid guidance.
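For reference, the table's order-of-magnitude and bias metrics can be computed from paired predicted/observed flux series as follows; the flux values here are synthetic, for illustration only.

```python
import math

# How the table's metrics can be computed from paired predicted and observed
# SEP fluxes. The flux values below are synthetic, for illustration only.

def log10_errors(pred, obs):
    """Signed log10 prediction errors for paired flux samples."""
    return [math.log10(p / o) for p, o in zip(pred, obs)]

def median_log_error(pred, obs):
    """Median of the signed log10 errors (the table's bias metric)."""
    errs = sorted(log10_errors(pred, obs))
    n, mid = len(errs), len(errs) // 2
    return errs[mid] if n % 2 else 0.5 * (errs[mid - 1] + errs[mid])

def frac_within_10x(pred, obs):
    """Fraction of predictions within one order of magnitude of observation."""
    errs = log10_errors(pred, obs)
    return sum(abs(e) <= 1.0 for e in errs) / len(errs)

obs  = [1.0, 5.0, 20.0, 80.0]   # observed fluxes (arbitrary units)
pred = [1.5, 4.0, 30.0, 900.0]  # last prediction misses by more than 10x
print(frac_within_10x(pred, obs))   # 0.75
print(median_log_error(pred, obs))
```

The Spearman ρ column is the standard rank correlation of the same paired series and is omitted here for brevity.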

A plausible implication is that adaptive grid strategies allow physics-based SEP models to meet both latency and accuracy requirements, with Setup 2 providing timely forecaster support and Setup 1 providing refined guidance.

5. Integration with Operational Workflows and Model Refinement

Explicit interactive feedback was solicited throughout the exercise with operational forecasters, console operators, and analysts. Key workflow adaptations included:

  • “Fast-first, accurate-later” dual-run protocol: Launch Setup 2 immediately upon CME detection for rapid, order-of-magnitude predictions (within 5 hr). Setup 1 runs in parallel, providing higher-fidelity forecasts for mission planning (completed in ∼21 hr).
  • Hybrid grid proposal (Setup 3): Coarse background grid but full-resolution AMR in critical regions (HCS and CME–Earth cone) to balance time-to-solution and forecast fidelity.
  • Daily precomputation of the ambient solar wind background; only a CME launch triggers a SOFIE restart, minimizing operational latency (∼1 hr).
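The dual-run dispatch above can be sketched as follows; `run_setup` is a hypothetical stand-in for launching a SOFIE configuration, not an actual SWMF interface.

```python
from concurrent.futures import ThreadPoolExecutor
import time

# "Fast-first, accurate-later": on CME detection, launch the coarse and fine
# setups in parallel and hand off the coarse result for rapid guidance.
# run_setup is a hypothetical placeholder, not an actual SWMF/SOFIE interface.

def run_setup(name, runtime_s):
    """Stand-in for a SOFIE run; sleeps instead of simulating."""
    time.sleep(runtime_s)
    return f"{name}: forecast ready"

def on_cme_detected():
    with ThreadPoolExecutor(max_workers=2) as pool:
        fast = pool.submit(run_setup, "Setup 2 (coarse, ~5 hr)", 0.01)
        fine = pool.submit(run_setup, "Setup 1 (fine, ~21 hr)", 0.05)
        # Rapid, order-of-magnitude guidance is available as soon as the
        # coarse run completes; the fine run refines it later.
        return [fast.result(), fine.result()]

results = on_cme_detected()
for line in results:
    print(line)
```

In practice the two runs would execute on separate supercomputer allocations, with the precomputed ambient solar wind background already in place so that only the CME launch triggers the restart.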

This collaborative model-operations loop strengthened procedural realism and responsiveness, aligning advanced model outputs with practical human spaceflight support needs.

6. Implications for Forecasting, Future Exercises, and Model Evolution

The SWPT exercise demonstrated that physics-based SEP prediction models, traditionally seen as too computationally expensive for operations, can achieve real-time (even faster than real-time) and accurate forecasts when optimized with adaptive gridding and workflow integration.

Significant implications include:

  • The “fast-first, accurate-later” protocol ensures rapid situational awareness followed by high-fidelity mission support, mitigating operational risk for human exploration (e.g., Artemis II).
  • Model physics refinements are encouraged, such as replacing ad-hoc SEP injection with physically motivated suprathermal seed populations and empirically tuned mean free path estimates.
  • Future SWPTs will likely incorporate ensemble (multi-event, multi-model) runs, include more realistic background solar wind (with prior CMEs), and enable validation with multi-spacecraft data (e.g., Parker Solar Probe, Solar Orbiter).
  • Pipeline integration: seamless transitions from automated CME detection → real-time SOFIE simulation → product dissemination to SWPC/SRAG → iterative feedback to modelers.

A plausible implication is that operational adoption of these approaches will set a precedent for the broader use of computationally intensive, physically rigorous space weather models in mission-critical forecasting pipelines.
