Learning Discriminative Representation with Signed Laplacian Restricted Boltzmann Machine (1808.09389v1)
Published 28 Aug 2018 in cs.CV
Abstract: We investigate the potential of the restricted Boltzmann machine (RBM) for discriminative representation learning. By imposing class-information preservation constraints on the hidden layer of the RBM, we propose a Signed Laplacian Restricted Boltzmann Machine (SLRBM) for supervised discriminative representation learning. The model utilizes label information and simultaneously preserves the global locality of the data points. Experimental results on the benchmark data set show the effectiveness of our method.
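The abstract does not spell out the SLRBM objective, but the idea of coupling an RBM with a signed Laplacian built from labels can be illustrated. Below is a minimal NumPy sketch, assuming the regularizer takes the common form tr(H^T L H), where L is a signed Laplacian with positive edges between same-class points and negative edges between different-class points, and H holds the hidden-unit probabilities. The variable names, the contrastive-divergence training loop, and the penalty weight `lam` are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch of a signed-Laplacian-regularized Bernoulli RBM.
# The exact SLRBM objective in the paper may differ; this only shows how a
# signed Laplacian built from class labels could constrain hidden activations.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def signed_laplacian(y):
    """Signed affinity: +1 for same-class pairs, -1 for different-class pairs."""
    S = np.where(y[:, None] == y[None, :], 1.0, -1.0)
    np.fill_diagonal(S, 0.0)
    D = np.diag(np.abs(S).sum(axis=1))
    return D - S

# Toy data: two Gaussian blobs with binary labels, binarized for a Bernoulli RBM.
X = np.vstack([rng.normal(0, 1, (50, 20)), rng.normal(2, 1, (50, 20))])
X = (X > X.mean()).astype(float)
y = np.array([0] * 50 + [1] * 50)
L = signed_laplacian(y)

n_vis, n_hid, lam, lr = X.shape[1], 32, 0.01, 0.05
W = 0.01 * rng.standard_normal((n_vis, n_hid))
b_v, b_h = np.zeros(n_vis), np.zeros(n_hid)

for epoch in range(100):
    # CD-1 positive phase: hidden probabilities and samples given the data.
    H_prob = sigmoid(X @ W + b_h)
    H_samp = (rng.random(H_prob.shape) < H_prob).astype(float)
    # Negative phase: one Gibbs step back to the visible layer and up again.
    V_prob = sigmoid(H_samp @ W.T + b_v)
    H_neg = sigmoid(V_prob @ W + b_h)

    # Contrastive-divergence approximation of the likelihood gradient.
    dW = (X.T @ H_prob - V_prob.T @ H_neg) / len(X)

    # Gradient of the signed-Laplacian penalty tr(H^T L H) w.r.t. W,
    # back-propagated through the sigmoid of the hidden units.
    dPen = X.T @ ((2.0 * L @ H_prob) * H_prob * (1.0 - H_prob)) / len(X)

    W += lr * (dW - lam * dPen)
    b_v += lr * (X - V_prob).mean(axis=0)
    b_h += lr * (H_prob - H_neg).mean(axis=0)

print("signed-Laplacian penalty:", np.trace(H_prob.T @ L @ H_prob))
```

Driving the penalty down pulls hidden codes of same-class points together and pushes different-class points apart, which is one plausible reading of "utilizes label information and preserves the global locality of the data points."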