Functional mixture-of-experts for classification (2202.13934v1)
Published 28 Feb 2022 in stat.ML, cs.AI, and cs.LG
Abstract: We develop a mixture-of-experts (ME) approach to multiclass classification in which the predictors are univariate functions. It consists of a ME model in which both the gating network and the expert network are built on multinomial logistic activation functions with functional inputs. We perform regularized maximum likelihood estimation in which the coefficient functions are subject to interpretable sparsity constraints on targeted derivatives. We develop an EM-Lasso-like algorithm to compute the regularized MLE and evaluate the proposed approach on simulated and real data.
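To make the model structure concrete, the following is a minimal NumPy sketch of the forward pass of a functional mixture-of-experts classifier: each curve is reduced to basis coefficients (a polynomial basis is used here purely for illustration; the paper's actual basis and the sparsity-penalized EM-Lasso estimation are not reproduced), and both the gating network and the expert networks are multinomial logistic (softmax) functions of those coefficients.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def project_curves(curves, t, degree=3):
    # Represent each univariate functional predictor by its
    # least-squares polynomial basis coefficients (an illustrative
    # stand-in for the paper's functional representation).
    B = np.vander(t, degree + 1, increasing=True)   # (T, d)
    coef, *_ = np.linalg.lstsq(B, curves.T, rcond=None)
    return coef.T                                   # (n, d)

def me_class_probs(X, W_gate, W_experts):
    # Mixture-of-experts class probabilities:
    #   p(y = c | x) = sum_k gate_k(x) * expert_k(c | x),
    # with multinomial logistic gating and experts.
    gates = softmax(X @ W_gate)                                 # (n, K)
    experts = softmax(np.einsum('nd,kdc->nkc', X, W_experts))   # (n, K, C)
    return np.einsum('nk,nkc->nc', gates, experts)              # (n, C)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)
curves = rng.normal(size=(8, 50))      # 8 sampled curves (hypothetical data)
X = project_curves(curves, t)          # (8, 4) basis coefficients
K, C, d = 2, 3, X.shape[1]             # experts, classes, basis dimension
W_gate = rng.normal(size=(d, K))
W_experts = rng.normal(size=(K, d, C))
P = me_class_probs(X, W_gate, W_experts)
print(P.shape, np.allclose(P.sum(axis=1), 1.0))
```

Each row of `P` is a convex combination of the experts' class-probability vectors, so it sums to one; in the paper these weight matrices would be fitted by the regularized EM-Lasso procedure rather than drawn at random.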
- Nhat Thien Pham (3 papers)
- Faicel Chamroukhi (35 papers)