Mixed Effects Mixture of Experts: Modeling Double Heterogeneous Trajectories
Abstract: The linear mixed-effects model (LMM) is a cornerstone of longitudinal data analysis, but it cannot adequately capture heterogeneity arising simultaneously from group-specific fixed effects and subject-specific random effects. To address this challenge, we propose a mixed effects mixture of experts model (MEMoE), a novel statistical framework that borrows an architectural prototype from large models. The framework integrates the divide-and-conquer paradigm of Mixture of Experts models with classical mixed-effects modeling. In MEMoE, each expert is a full LMM dedicated to capturing the longitudinal trajectory of a specific latent subpopulation, while a gating function learns to route subjects to the most appropriate expert in a data-driven manner based on baseline covariates. We develop a robust inferential procedure for parameter estimation based on a Laplace Expectation-Maximization algorithm, with standard errors calibrated using robust sandwich estimators to account for potential model misspecification. Extensive simulation studies and an empirical application demonstrate that MEMoE outperforms both the traditional single-population LMM and conventional Mixture of Experts models in parameter recovery, classification accuracy, and overall model fit.
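As a minimal sketch of the model the abstract describes (the notation and the softmax gating form are our assumptions; the paper may parameterize these differently), expert $k$ is a full LMM and a gating function maps baseline covariates to mixing weights:

\[
y_i \mid (z_i = k,\, b_{ik}) = X_i \beta_k + Z_i b_{ik} + \varepsilon_{ik}, \qquad b_{ik} \sim \mathcal{N}(0, D_k), \quad \varepsilon_{ik} \sim \mathcal{N}(0, \sigma_k^2 I_{n_i}),
\]
\[
\Pr(z_i = k \mid w_i) = \frac{\exp(w_i^\top \gamma_k)}{\sum_{l=1}^{K} \exp(w_i^\top \gamma_l)}, \qquad
f(y_i) = \sum_{k=1}^{K} \Pr(z_i = k \mid w_i)\, f_k(y_i \mid X_i, Z_i),
\]

where $z_i$ indexes subject $i$'s latent subpopulation, $w_i$ collects the baseline covariates used by the gating function, and $f_k$ is the marginal density of expert $k$'s LMM after integrating out the random effects $b_{ik}$, so that each expert implies a within-subject covariance $Z_i D_k Z_i^\top + \sigma_k^2 I_{n_i}$.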