
Noise-Augmented $\ell_0$ Regularization of Tensor Regression with Tucker Decomposition (2302.10775v2)

Published 19 Feb 2023 in stat.ML

Abstract: Tensor data are multi-dimensional arrays. Low-rank decomposition-based regression methods with tensor predictors exploit the structural information in tensor predictors while significantly reducing the number of parameters in tensor regression. We propose a method named NA$_0$CT$2$ (Noise Augmentation for $\ell_0$ regularization on Core Tensor in Tucker decomposition) to regularize the parameters in tensor regression (TR), coupled with Tucker decomposition. We establish theoretically that NA$_0$CT$2$ achieves exact $\ell_0$ regularization on the core tensor from the Tucker decomposition in linear TR and generalized linear TR. To our knowledge, NA$_0$CT$2$ is the first Tucker decomposition-based regularization method in TR to achieve $\ell_0$ regularization on core tensors. NA$_0$CT$2$ is implemented through an iterative procedure that involves two straightforward steps in each iteration: generating noisy data based on the core tensor from the Tucker decomposition of the updated parameter estimate, and running a regular GLM on the noise-augmented data with vectorized predictors. We demonstrate the implementation of NA$_0$CT$2$ and its $\ell_0$ regularization effect in both simulation studies and real data applications. The results suggest that NA$_0$CT$2$ can improve predictions compared to other decomposition-based TR approaches, with or without regularization, and that it identifies important predictors even though it is not designed for that purpose.
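The Tucker decomposition underlying the method factors a tensor into a small core tensor multiplied by a factor matrix along each mode; the core tensor is the object the paper's $\ell_0$ regularization targets. As a minimal illustration of that building block (not the paper's algorithm), the truncated higher-order SVD below computes a Tucker core and factors in NumPy; the `unfold`/`hosvd` helpers are names chosen here for the sketch:

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: move axis `mode` to the front, then flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    # Truncated higher-order SVD: each factor matrix holds the leading
    # left singular vectors of the corresponding mode unfolding.
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])
    # Core tensor G = T x_1 U1^T x_2 U2^T x_3 U3^T ...
    G = T
    for mode, U in enumerate(factors):
        G = np.moveaxis(np.tensordot(U.T, G, axes=(1, mode)), 0, mode)
    return G, factors

def reconstruct(G, factors):
    # Multiply the core back by each factor matrix along its mode.
    T = G
    for mode, U in enumerate(factors):
        T = np.moveaxis(np.tensordot(U, T, axes=(1, mode)), 0, mode)
    return T

rng = np.random.default_rng(0)
# Build a 5x6x7 tensor with exact multilinear rank (2, 2, 2).
G_true = rng.standard_normal((2, 2, 2))
Us = [np.linalg.qr(rng.standard_normal((n, 2)))[0] for n in (5, 6, 7)]
T = reconstruct(G_true, Us)

G, factors = hosvd(T, ranks=(2, 2, 2))
rel_err = np.linalg.norm(T - reconstruct(G, factors)) / np.linalg.norm(T)
print(G.shape, rel_err < 1e-8)  # (2, 2, 2) core; exact recovery here
```

For an exactly low-multilinear-rank tensor the truncated HOSVD recovers it exactly, which is why the relative error above is numerically zero; in the regression setting the core tensor plays the role of the compressed coefficient array that NA$_0$CT$2$ sparsifies.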
