Adaptive Group Lasso Neural Network Models for Functions of Few Variables and Time-Dependent Data (2108.10825v2)

Published 24 Aug 2021 in cs.LG, cs.NA, and math.NA

Abstract: In this paper, we propose an adaptive group Lasso deep neural network for high-dimensional function approximation where the input data are generated from a dynamical system and the target function depends on a few active variables or a few linear combinations of variables. We approximate the target function by a deep neural network and enforce an adaptive group Lasso constraint on the weights of a suitable hidden layer in order to represent the constraint on the target function. We use the proximal algorithm to optimize the penalized loss function. Using the non-negativity of the Bregman distance, we prove that the proposed optimization procedure achieves loss decay. Our empirical studies show that the proposed method outperforms recent state-of-the-art methods, including the sparse dictionary matrix method and neural networks with or without a group Lasso penalty.

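The optimization described in the abstract alternates a gradient step on the data-fit loss with a proximal step that applies block soft-thresholding to the weight groups of the penalized hidden layer. The snippet below is a minimal PyTorch sketch of that idea, not the authors' implementation: the network size, learning rate, and per-group penalty values are illustrative assumptions, and an adaptive group Lasso would typically derive the per-group penalties from a preliminary fit rather than fixing them up front.

```
# Illustrative sketch (hypothetical setup, not the paper's code): one proximal-
# gradient loop that applies a group Lasso penalty to the columns of the first
# layer's weight matrix, so inactive input variables are zeroed out as a group.
import torch

def group_lasso_prox(W, thresholds):
    """Block soft-thresholding: shrink each column of W by its threshold.

    W          -- (hidden_dim, input_dim) weight matrix
    thresholds -- per-column penalties (input_dim,); an adaptive scheme would
                  set these from a preliminary estimate of the weights.
    """
    norms = W.norm(dim=0)                              # column-wise L2 norms
    scale = torch.clamp(1.0 - thresholds / (norms + 1e-12), min=0.0)
    return W * scale                                   # zero out weak columns

# Toy setup: a small network whose first layer carries the group structure.
torch.manual_seed(0)
net = torch.nn.Sequential(
    torch.nn.Linear(10, 32), torch.nn.ReLU(), torch.nn.Linear(32, 1)
)
x = torch.randn(256, 10)
y = (x[:, 0] * x[:, 1]).unsqueeze(1)                   # depends on 2 variables

lr = 1e-2
lam = torch.full((10,), 5e-3)                          # hypothetical penalties
for _ in range(200):
    loss = torch.nn.functional.mse_loss(net(x), y)
    net.zero_grad()
    loss.backward()
    with torch.no_grad():
        for p in net.parameters():                     # plain gradient step
            p -= lr * p.grad
        W = net[0].weight                              # proximal step on the
        W.copy_(group_lasso_prox(W, lr * lam))         # penalized layer only
```

After training, the columns of the first layer whose norms were driven to zero mark the input variables the fitted model treats as inactive.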
Authors (3)
  1. Lam Si Tung Ho (34 papers)
  2. Nicholas Richardson (3 papers)
  3. Giang Tran (15 papers)
Citations (2)
