Bayesian Semiparametric Orthogonal Tucker Factorized Mixed Models for Multi-dimensional Longitudinal Functional Data (2506.16668v1)

Published 20 Jun 2025 in stat.ME

Abstract: We introduce a novel longitudinal mixed model for analyzing complex multidimensional functional data, addressing challenges such as high resolution, structural complexity, and computational demands. Our approach integrates dimension reduction techniques, including basis function representation and Tucker tensor decomposition, to model complex functional (e.g., spatial and temporal) variations, group differences, and individual heterogeneity while drastically reducing model dimensions. The model accommodates multiplicative random effects whose marginalization yields a novel Tucker-decomposed covariance-tensor framework. To ensure scalability, we employ semi-orthogonal mode matrices implemented via a novel graph-Laplacian-based smoothness prior with low-rank approximation, leading to an efficient posterior sampling method. A cumulative shrinkage strategy promotes sparsity and enables semiautomated rank selection. We establish theoretical guarantees for posterior convergence and demonstrate the method's effectiveness through simulations, showing significant improvements over existing techniques. Applying the method to Alzheimer's Disease Neuroimaging Initiative (ADNI) neuroimaging data reveals novel insights into local brain changes associated with disease progression, highlighting the method's practical utility for studying cognitive decline and neurodegenerative conditions.
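To make the abstract's main computational ingredients concrete, the sketch below illustrates (i) a Tucker factorization with semi-orthogonal mode matrices via truncated higher-order SVD and (ii) a simple graph-Laplacian roughness penalty of the kind that could underlie a smoothness prior. This is a minimal, hedged illustration, not the authors' implementation: the tensor shape, ranks, chain-graph Laplacian, and all variable names are hypothetical choices made here for exposition.

```python
# Minimal sketch (assumed setup, not the paper's code): Tucker factorization
# with semi-orthogonal mode matrices via truncated HOSVD, plus a toy
# graph-Laplacian smoothness penalty.
import numpy as np


def unfold(tensor, mode):
    """Mode-n matricization: move `mode` to the front and flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)


def mode_product(tensor, matrix, mode):
    """Multiply a tensor by a matrix along the given mode."""
    moved = np.moveaxis(tensor, mode, 0)
    shape = moved.shape
    result = matrix @ moved.reshape(shape[0], -1)
    return np.moveaxis(result.reshape((matrix.shape[0],) + shape[1:]), 0, mode)


def tucker_hosvd(tensor, ranks):
    """Truncated HOSVD: returns a core tensor and one semi-orthogonal
    mode matrix (leading left singular vectors) per mode."""
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
        factors.append(U[:, :r])  # columns are orthonormal: U_r^T U_r = I_r
    core = tensor
    for mode, U in enumerate(factors):
        core = mode_product(core, U.T, mode)
    return core, factors


def chain_graph_laplacian(n):
    """Graph Laplacian L = D - A of a path graph over n grid points;
    quadratic forms x^T L x penalize roughness of a function on the grid
    (a common smoothness-prior construction, assumed here for illustration)."""
    A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
    return np.diag(A.sum(axis=1)) - A


# Toy multidimensional functional observation on a space x space x time grid.
rng = np.random.default_rng(0)
Y = rng.standard_normal((30, 30, 20))
core, (U1, U2, U3) = tucker_hosvd(Y, ranks=(5, 5, 3))
print(core.shape, U1.shape, np.allclose(U1.T @ U1, np.eye(5)))

L = chain_graph_laplacian(30)  # roughness penalty for mode-matrix columns
```

In this reading, the Tucker core and low-rank mode matrices play the dimension-reduction role described in the abstract, while a Laplacian-based penalty on the mode matrices encourages smooth spatial/temporal loadings; the paper's actual prior, sampler, and shrinkage scheme are more elaborate than this toy example.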
