Regularization and feature selection for large dimensional data (1712.01975v3)

Published 6 Dec 2017 in cs.LG, cs.NA, and math.OC

Abstract: Feature selection has evolved into an important step in several machine learning paradigms. In domains like bio-informatics and text classification, which involve high-dimensional data, feature selection can drastically reduce the feature space. In cases where it is difficult or infeasible to obtain a sufficient number of training examples, feature selection helps overcome the curse of dimensionality, which in turn improves the performance of the classification algorithm. The focus of our research here is five embedded feature selection methods which use either ridge regression, Lasso regression, or a combination of the two in the regularization part of the optimization function. We evaluate the five chosen methods on five large dimensional datasets and compare them with respect to dataset sparsity, feature correlation, and execution time.
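The embedded methods the abstract refers to perform selection as a by-product of fitting a regularized regression: an L1 (Lasso) penalty drives many coefficients exactly to zero, and the surviving nonzero coefficients define the selected features. The sketch below is an illustrative, numpy-only coordinate-descent Lasso, not the paper's implementation; the data, the penalty value `lam`, and the helper names are assumptions for demonstration.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the L1 norm: shrinks x toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_cd(X, y, lam, n_iters=200):
    # Coordinate descent for: min_w (1/2n)||y - Xw||^2 + lam * ||w||_1
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature
    for _ in range(n_iters):
        for j in range(d):
            # Partial residual excluding feature j's current contribution.
            r_j = y - X @ w + X[:, j] * w[j]
            rho = X[:, j] @ r_j / n
            w[j] = soft_threshold(rho, lam) / col_sq[j]
    return w

# Synthetic example: only 5 of 50 features carry signal.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))
true_w = np.zeros(50)
true_w[:5] = [3.0, -2.0, 1.5, 2.0, -1.0]
y = X @ true_w + 0.1 * rng.standard_normal(100)

w = lasso_cd(X, y, lam=0.1)
selected = np.flatnonzero(np.abs(w) > 1e-6)  # embedded feature selection
```

Increasing `lam` yields a sparser selection; the ridge (L2) and elastic-net variants the paper studies replace or augment the L1 term, trading exact sparsity for stability under correlated features.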

Authors (4)
  1. Nand Sharma (5 papers)
  2. Prathamesh Verlekar (1 paper)
  3. Rehab Ashary (1 paper)
  4. Sui Zhiquan (1 paper)
