Kernelized Supervised Dictionary Learning (1207.2488v4)

Published 10 Jul 2012 in cs.CV and cs.LG

Abstract: In this paper, we propose supervised dictionary learning (SDL) by incorporating information on class labels into the learning of the dictionary. To this end, we propose to learn the dictionary in a space where the dependency between the signals and their corresponding labels is maximized. To maximize this dependency, the recently introduced Hilbert-Schmidt independence criterion (HSIC) is used. One of the main advantages of this novel approach for SDL is that it can be easily kernelized by incorporating a kernel, particularly a data-derived kernel such as normalized compression distance, into the formulation. The learned dictionary is compact and the proposed approach is fast. We show that it outperforms other unsupervised and supervised dictionary learning approaches in the literature, using real-world data.
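To make the abstract's idea concrete, the following is a minimal sketch of HSIC-guided dictionary learning in the linear case. It assumes the objective reduces to maximizing tr(U^T X H L H X^T U) over orthonormal U, with H the centering matrix and L a delta kernel on class labels, which is solved by an eigendecomposition. The function names, toy data, and coding step are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def empirical_hsic(K, L):
    """Biased empirical HSIC estimate between kernel matrices K and L (both n x n)."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def hsic_sdl_linear(X, y, n_atoms):
    """Sketch of HSIC-guided supervised dictionary learning (linear case).

    X : (d, n) data matrix, one signal per column.
    y : (n,) class labels.
    Returns a (d, n_atoms) dictionary with orthonormal atoms.
    """
    d, n = X.shape
    H = np.eye(n) - np.ones((n, n)) / n
    # Delta kernel on labels: 1 if two samples share a class, else 0.
    L = (y[:, None] == y[None, :]).astype(float)
    # Maximize tr(U^T X H L H X^T U) subject to U^T U = I:
    # the maximizer is given by the top eigenvectors of the symmetric matrix M.
    M = X @ H @ L @ H @ X.T
    eigvals, eigvecs = np.linalg.eigh(M)         # eigenvalues in ascending order
    return eigvecs[:, ::-1][:, :n_atoms]         # keep the leading n_atoms directions

# Toy usage with synthetic data (illustrative only).
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 100))
y = rng.integers(0, 3, size=100)
D = hsic_sdl_linear(X, y, n_atoms=5)
codes = D.T @ X                                   # coefficients in the learned dictionary
K_codes = codes.T @ codes                         # linear kernel of the representations
L_labels = (y[:, None] == y[None, :]).astype(float)
print("HSIC(codes, labels):", empirical_hsic(K_codes, L_labels))
```

In the kernelized setting described in the abstract, the same objective would be expressed through a data-derived kernel matrix (e.g., one built from normalized compression distance) in place of X itself; that variant is not shown here.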

Authors (3)
  1. Ali Ghodsi (73 papers)
  2. Mohamed S. Kamel (8 papers)
  3. Mehrdad J. Gangeh (8 papers)
Citations (77)
