Functional mixture-of-experts for classification (2202.13934v1)

Published 28 Feb 2022 in stat.ML, cs.AI, and cs.LG

Abstract: We develop a mixture-of-experts (ME) approach to multiclass classification in which the predictors are univariate functions. It consists of an ME model in which both the gating network and the experts are constructed upon multinomial logistic activation functions with functional inputs. We perform regularized maximum likelihood estimation in which the coefficient functions enjoy interpretable sparsity constraints on targeted derivatives. We develop an EM-Lasso-like algorithm to compute the regularized MLE and evaluate the proposed approach on simulated and real data.
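The model structure described in the abstract can be illustrated with a minimal sketch. The following is not the authors' implementation; it assumes a common setup in which each functional predictor has already been projected onto a finite basis (e.g. B-splines), so a curve is represented by a coefficient vector. The gating network and each expert are then multinomial logistic (softmax) models over those coefficients, and the class probabilities are the gate-weighted mixture of expert outputs. All shapes and weight matrices below are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: n curves, each represented by p basis
# coefficients; K experts; G classes.
n, p, K, G = 8, 10, 3, 2
X = rng.normal(size=(n, p))        # basis-coefficient representation of curves

# Gating network: multinomial logistic weights over the K experts.
W_gate = rng.normal(size=(p, K))
# Experts: each is a multinomial logistic classifier over the G classes.
W_exp = rng.normal(size=(K, p, G))

def softmax(z, axis=-1):
    """Numerically stable softmax along the given axis."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

gate = softmax(X @ W_gate)                              # (n, K) mixing weights
experts = softmax(np.einsum('np,kpg->nkg', X, W_exp))   # (n, K, G) per-expert class probs
proba = np.einsum('nk,nkg->ng', gate, experts)          # (n, G) mixture class probabilities

# Each row is a valid probability distribution over the G classes.
print(proba.shape)
```

In the paper's approach the weights would be estimated by regularized maximum likelihood (an EM-Lasso-like algorithm), with sparsity penalties applied to derivatives of the coefficient functions rather than to raw coefficients as sketched here.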

Authors (2)
  1. Nhat Thien Pham (3 papers)
  2. Faicel Chamroukhi (35 papers)
Citations (1)
