Scalable algorithms for DSEMs with ordinal responses

Develop scalable inference algorithms for dynamic structural equation models (DSEMs) with ordinal responses. The algorithms should remain efficient in intensive longitudinal data settings, avoid the inefficiencies introduced by threshold constraints in latent-response probit formulations, and not rely on manually tuned Metropolis proposals.

Background

The paper introduces a hybrid NUTS–Gibbs sampler with state-space marginalization that is efficient for binomial responses but becomes inefficient for ordinal outcomes under the common probit latent-response formulation. In the ordinal probit case, each threshold parameter must lie between the maximum latent response among observations in category c and the minimum latent response among observations in category c+1. With many observations these feasible intervals become extremely narrow, forcing NUTS to take very small steps and substantially reducing sampling efficiency.
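A minimal numerical sketch (not from the paper; the simulation setup and threshold value are illustrative assumptions) showing why these feasible intervals shrink: with latent responses z_i and observed categories y_i, the threshold separating categories 1 and 2 must lie strictly between max(z_i : y_i = 1) and min(z_i : y_i = 2), and the width of that interval collapses as n grows.

```python
import numpy as np

rng = np.random.default_rng(0)
tau_true = 0.0  # assumed true threshold between categories 1 and 2

for n in [100, 1_000, 10_000, 100_000]:
    z = rng.standard_normal(n)           # simulated latent responses
    y = (z > tau_true).astype(int) + 1   # observed ordinal categories (1 or 2)
    lower = z[y == 1].max()              # max latent response in category 1
    upper = z[y == 2].min()              # min latent response in category 2
    print(f"n={n:>7d}  feasible threshold interval width = {upper - lower:.2e}")
```

The width decreases roughly like 1/n, since it is the gap between adjacent order statistics around the threshold; this is the narrowing constraint region that forces NUTS into tiny steps.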

Although pure NUTS avoids the latent-response bottleneck and performed well in the authors' experiments, the hybrid approach, which is central to exploiting Kalman-filter marginalization, remains inefficient for ordinal data. Joint Metropolis updates of the thresholds and latent responses would require manual tuning, which hinders automated, scalable inference. Consequently, developing scalable algorithms for DSEMs with ordinal responses remains an open problem.
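To illustrate the manual-tuning issue, here is a hedged sketch (not the paper's sampler; the interval width, step sizes, and flat-prior target are illustrative assumptions) of a random-walk Metropolis update for a threshold constrained to a narrow feasible interval. The proposal scale must be matched to the interval width: too large and nearly every proposal is rejected, too small and the chain barely moves, which is exactly the tuning burden that blocks automation.

```python
import numpy as np

def metropolis_threshold_update(tau, lower, upper, step, rng):
    """One random-walk proposal for a threshold tau constrained to (lower, upper).

    Under a flat prior on the feasible interval the target is uniform, so any
    in-interval proposal is accepted and out-of-interval proposals are rejected.
    """
    proposal = tau + step * rng.standard_normal()
    if lower < proposal < upper:
        return proposal, True
    return tau, False

rng = np.random.default_rng(1)
lower, upper = -0.001, 0.001  # narrow feasible interval (many observations)
for step in [1.0, 0.01, 0.0005]:
    tau, accepts = 0.0, 0
    for _ in range(1_000):
        tau, accepted = metropolis_threshold_update(tau, lower, upper, step, rng)
        accepts += accepted
    print(f"step={step:<7} acceptance rate = {accepts / 1_000:.2f}")
```

A proposal scale of 1.0 is rejected almost always, while one near the interval width keeps acceptance high; the "right" scale depends on the data size, so it cannot be fixed once in an automated pipeline.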
