
Multilinear Discriminant Analysis using a new family of tensor-tensor products (2203.00967v1)

Published 2 Mar 2022 in math.NA and cs.NA

Abstract: Multilinear Discriminant Analysis (MDA) is a powerful dimension reduction method specifically formulated to deal with tensor data. Precisely, the goal of MDA is to find mode-specific projections that optimally separate tensor data from different classes. However, standard MDA methods solve this task with alternating optimization heuristics involving the computation of a succession of tensor-matrix products. Such approaches are often difficult to solve and not natural, highlighting the difficulty of formulating this problem in fully tensor form. In this paper, we propose to solve multilinear discriminant analysis (MDA) by using the concept of transform domain (TD) recently proposed in \cite{Kilmer2011}. We show that moving MDA to this transform domain makes its resolution easier and more natural. More precisely, each frontal face of the transformed tensor is processed independently to build a separate optimization sub-problem that is easier to solve. Next, the obtained solutions are converted back into projective tensors by the inverse transform. Through a large number of experiments, we show the effectiveness of our approach with respect to existing MDA methods.
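
To give a rough sense of the transform-domain idea sketched in the abstract, the following is a minimal Python illustration, not the paper's exact algorithm. It assumes the DFT along the third mode as the transform (as in the t-product framework of \cite{Kilmer2011}), third-order data tensors, and a standard LDA-style generalized eigenproblem solved independently on each frontal face; the function name transform_domain_mda and its parameters are hypothetical.

```python
# Minimal sketch of the transform-domain approach (an illustration, not the
# paper's exact algorithm). Assumes: third-order samples of shape (n1, n2, n3),
# the DFT along mode 3 as the transform, and an LDA-style generalized
# eigenproblem solved independently on each frontal face. Names are hypothetical.
import numpy as np
from scipy.linalg import eigh

def transform_domain_mda(samples, labels, k):
    """samples: list of (n1, n2, n3) arrays; labels: class ids; k: reduced mode-1 dimension."""
    X = np.stack(samples)                        # (N, n1, n2, n3)
    labels = np.asarray(labels)
    Xh = np.fft.fft(X, axis=3)                   # move all samples to the transform domain
    N, n1, n2, n3 = X.shape
    Qh = np.zeros((n1, k, n3), dtype=complex)    # frontal faces of the projective tensor
    for f in range(n3):                          # each frontal face handled independently
        S = Xh[:, :, :, f]                       # (N, n1, n2) complex slices
        mean_all = S.mean(axis=0)
        Sb = np.zeros((n1, n1), dtype=complex)   # between-class scatter (mode 1)
        Sw = np.zeros((n1, n1), dtype=complex)   # within-class scatter (mode 1)
        for c in np.unique(labels):
            Sc = S[labels == c]
            mc = Sc.mean(axis=0)
            d = mc - mean_all
            Sb += Sc.shape[0] * d @ d.conj().T
            for s in Sc:
                e = s - mc
                Sw += e @ e.conj().T
        # Face-wise sub-problem: maximize between- vs. within-class scatter,
        # i.e. solve the generalized eigenproblem Sb v = lambda Sw v.
        w, V = eigh(Sb, Sw + 1e-6 * np.eye(n1))  # small ridge keeps Sw positive definite
        Qh[:, :, f] = V[:, ::-1][:, :k]          # keep the k leading discriminant directions
    # Inverse transform along mode 3 returns the projective tensor to the original domain.
    return np.fft.ifft(Qh, axis=2)
```

Under these assumptions, a new sample would be projected face-by-face in the transform domain (multiplying each frontal face of its FFT by the conjugate transpose of the corresponding face of the projection) and then brought back by the inverse FFT, mirroring a tensor-tensor product.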

Citations (1)
