
Efficient Approximate Solutions to Mutual Information Based Global Feature Selection (1706.07535v1)

Published 23 Jun 2017 in cs.LG and stat.ML

Abstract: Mutual Information (MI) is often used for feature selection when developing classifier models. Estimating the MI for a subset of features is often intractable. We demonstrate that, under assumptions of conditional independence, the MI of a subset of features can be expressed in terms of the Conditional Mutual Information (CMI) between pairs of features. But selecting the features with the highest CMI turns out to be a hard combinatorial problem. In this work, we apply two global methods, the Truncated Power Method (TPower) and Low Rank Bilinear Approximation (LowRank), to solve the feature selection problem. These algorithms provide very good approximations to the NP-hard CMI-based feature selection problem. We experimentally demonstrate the effectiveness of these procedures across multiple datasets and compare them with existing MI-based global and iterative feature selection procedures.
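To illustrate the flavor of the TPower approach mentioned in the abstract, the sketch below applies a generic truncated power iteration to a pairwise feature-score matrix. This is a minimal illustration, not the paper's implementation: the matrix `Q`, the uniform initialization, and the toy scores are all assumptions standing in for the paper's CMI-based pairwise estimates.

```python
import numpy as np

def truncated_power_method(Q, k, n_iter=100):
    """Approximately maximize x^T Q x over k-sparse unit vectors x.

    Q : (d, d) symmetric matrix of pairwise feature scores
        (a stand-in for the paper's CMI-based scores).
    k : number of features to select.
    Returns the indices of the k selected features.
    """
    d = Q.shape[0]
    # Deterministic uniform start; random restarts are common in practice.
    x = np.ones(d) / np.sqrt(d)
    for _ in range(n_iter):
        y = Q @ x
        # Truncation step: keep the k largest-magnitude entries, zero the rest.
        idx = np.argsort(np.abs(y))[-k:]
        x = np.zeros(d)
        x[idx] = y[idx]
        norm = np.linalg.norm(x)
        if norm == 0:
            break
        x /= norm
    return np.flatnonzero(x)

# Toy score matrix: features 0-2 form a block that scores highly together.
Q = np.eye(6)
Q[:3, :3] += 1.0
selected = truncated_power_method(Q, k=3)  # -> array([0, 1, 2])
```

The truncation after each matrix-vector product is what keeps the iterate k-sparse, so the support of the converged vector directly names the selected feature subset.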

Authors (5)
  1. Hemanth Venkateswara (17 papers)
  2. Prasanth Lade (5 papers)
  3. Binbin Lin (50 papers)
  4. Jieping Ye (169 papers)
  5. Sethuraman Panchanathan (10 papers)
Citations (11)
