Astro-MoE: Mixture of Experts for Multiband Astronomical Time Series (2507.12611v1)
Published 16 Jul 2025 in astro-ph.IM
Abstract: Multiband astronomical time series exhibit heterogeneous variability patterns, sampling cadences, and signal characteristics across bands. Standard transformers apply shared parameters to all bands, potentially limiting their ability to model this rich structure. In this work, we introduce Astro-MoE, a foundational transformer architecture that enables dynamic processing via a Mixture of Experts module. We validate our model on both simulated (ELAsTiCC-1) and real-world datasets (Pan-STARRS1).
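To make the core idea concrete, below is a minimal sketch of a Mixture-of-Experts feed-forward layer with learned gating, the kind of dynamic per-token routing the abstract describes. This is not the Astro-MoE implementation; the layer sizes, the soft (dense) routing, and all names here are illustrative assumptions.

```python
# Minimal sketch of a Mixture-of-Experts (MoE) layer: a gating network scores
# experts per token, and the output is the gate-weighted sum of expert outputs.
# NOT the Astro-MoE implementation; sizes and soft routing are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class MoELayer:
    def __init__(self, d_model=16, n_experts=4):
        # One small linear "expert" per slot; a gating matrix scores experts.
        self.experts = [rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                        for _ in range(n_experts)]
        self.gate = rng.standard_normal((d_model, n_experts)) / np.sqrt(d_model)

    def __call__(self, x):
        # x: (n_tokens, d_model). Gate scores decide how much each expert
        # contributes to each token (soft routing, for simplicity).
        weights = softmax(x @ self.gate)                            # (n_tokens, n_experts)
        outputs = np.stack([x @ W for W in self.experts], axis=1)   # (n_tokens, n_experts, d_model)
        return (weights[..., None] * outputs).sum(axis=1)           # (n_tokens, d_model)

# e.g. 8 band-tagged time-series tokens; tokens from different bands can be
# routed to different experts, unlike a shared feed-forward block.
tokens = rng.standard_normal((8, 16))
out = MoELayer()(tokens)
print(out.shape)  # (8, 16)
```

In practice MoE transformers often use sparse top-k routing so only a few experts run per token; the dense weighting above is chosen purely to keep the sketch short and deterministic.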