Data Mapping and Finite Difference Learning (1909.08210v3)

Published 18 Sep 2019 in cs.LG and stat.ML

Abstract: The restricted Boltzmann machine (RBM) is a two-layer neural network constructed as a probabilistic model, and it is trained to maximize a product of probabilities by the contrastive divergence (CD) scheme. In this paper, a data mapping is proposed to describe the relationship between the visible and hidden layers, and training instead minimizes a squared error on the visible layer by finite difference learning. Three new properties of using the RBM are presented: 1) nodes on the visible and hidden layers can take real-valued matrix data without a probabilistic interpretation; 2) the well-known CD1 update is a finite difference approximation of gradient descent; 3) the activation can be a non-sigmoid function such as the identity, ReLU, or softsign. The data mapping provides a unified framework for dimensionality reduction, feature extraction, and data representation, as pioneered and developed by Hinton and his colleagues. As an approximation of gradient descent, finite difference learning is applicable to both directed and undirected graphs. Numerical experiments verify these new properties on very low-dimensional reduction, the collinearity of time series data, and the use of flexible activations.
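The core idea of the abstract, treating the visible-to-hidden-to-visible pass as a data mapping and reading the CD1 update as a finite difference step that reduces a squared reconstruction error on the visible layer, can be illustrated with a short sketch. The following is a minimal NumPy illustration under assumptions not stated in the abstract (batch layout, learning rate, dimensions, and the softsign activation); names such as `W`, `b_v`, and `b_h` are hypothetical and do not come from the paper.

```python
import numpy as np

def softsign(x):
    # One of the non-sigmoid activations mentioned in the abstract.
    return x / (1.0 + np.abs(x))

def reconstruct(V, W, b_v, b_h, act=softsign):
    # Data mapping: visible -> hidden -> visible, with real-valued data
    # and no probabilistic interpretation.
    H = act(V @ W + b_h)          # hidden representation
    V_rec = act(H @ W.T + b_v)    # reconstruction on the visible layer
    return H, V_rec

def cd1_style_update(V, W, b_v, b_h, lr=1e-2, act=softsign):
    # CD1-style update, interpreted (following the abstract) as a finite
    # difference approximation of gradient descent on the squared
    # reconstruction error of the visible layer.
    H, V_rec = reconstruct(V, W, b_v, b_h, act)
    H_rec = act(V_rec @ W + b_h)
    # Difference of data-phase and reconstruction-phase statistics,
    # analogous to the standard CD1 weight update.
    dW = (V.T @ H - V_rec.T @ H_rec) / V.shape[0]
    db_v = (V - V_rec).mean(axis=0)
    db_h = (H - H_rec).mean(axis=0)
    return W + lr * dW, b_v + lr * db_v, b_h + lr * db_h

# Illustrative usage: real-valued visible data mapped to a very
# low-dimensional hidden layer (dimensions chosen arbitrarily).
rng = np.random.default_rng(0)
V = rng.standard_normal((32, 8))          # batch of real-valued visible vectors
W = 0.01 * rng.standard_normal((8, 3))    # 8 visible units -> 3 hidden units
b_v, b_h = np.zeros(8), np.zeros(3)
for _ in range(100):
    W, b_v, b_h = cd1_style_update(V, W, b_v, b_h)
reconstruction_error = np.mean((V - reconstruct(V, W, b_v, b_h)[1]) ** 2)
```

The sketch uses deterministic activations rather than stochastic sampling, in keeping with the abstract's non-probabilistic reading of the RBM; the paper's actual formulation and experiments may differ in details.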
