Variational mean-field theory for training restricted Boltzmann machines with binary synapses (1911.07662v2)

Published 11 Nov 2019 in stat.ML, cond-mat.dis-nn, cs.LG, cs.NE, and q-bio.NC

Abstract: Unsupervised learning from raw data alone is not only a fundamental function of the cerebral cortex, but also a foundation for the next generation of artificial neural networks. However, a unified theoretical framework that treats sensory inputs, synapses, and neural activity together is still lacking. The computational obstacle originates from the discrete nature of synapses and the complex interactions among these three essential elements of learning. Here, we propose a variational mean-field theory in which the distribution of synaptic weights is considered. Unsupervised learning can then be decomposed into two intertwined steps: a maximization step, carried out as gradient ascent on a lower bound of the data log-likelihood, in which the synaptic weight distribution is determined by updating variational parameters; and an expectation step, carried out as a message-passing procedure on an equivalent (dual) neural network whose parameters are specified by the variational parameters of the weight distribution. Our framework therefore provides insight into how data (sensory inputs), synapses, and neural activities interact with one another to extract statistical regularities from sensory inputs. The variational framework is verified on restricted Boltzmann machines with planted synaptic weights and on learning handwritten digits.
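The two-step structure described in the abstract can be sketched in code. The following is a minimal, hypothetical illustration, not the paper's method: all function and variable names, the toy data, and the learning rate are invented for this sketch. Each binary synapse w_ij ∈ {-1, +1} is given a variational distribution with mean m_ij = tanh(theta_ij), and the maximization step performs gradient ascent on theta. The paper's expectation step is a message-passing procedure on a dual network; here a single contrastive-divergence Gibbs pass stands in for it, so the sketch shows the overall alternation rather than the paper's actual equations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions and variational parameters (all hypothetical choices).
# Each synapse w_ij in {-1, +1} has variational mean m_ij = tanh(theta_ij).
n_vis, n_hid = 20, 8
theta = 0.01 * rng.standard_normal((n_vis, n_hid))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_hidden(v, w):
    # P(h = 1 | v) for an RBM with {0, 1} units and no biases (a simplification).
    p = sigmoid(v @ w)
    return (rng.random(p.shape) < p).astype(float), p

def sample_visible(h, w):
    p = sigmoid(h @ w.T)
    return (rng.random(p.shape) < p).astype(float), p

def m_step(batch, theta, lr=0.05):
    """One gradient-ascent update of the variational parameters theta.

    The expectation over neural activity (the paper's message-passing
    E-step on a dual network) is replaced here by a single
    contrastive-divergence Gibbs pass, purely as a stand-in.
    """
    m = np.tanh(theta)                      # mean synaptic weights in (-1, 1)
    h0, p0 = sample_hidden(batch, m)
    v1, _ = sample_visible(h0, m)
    _, p1 = sample_hidden(v1, m)
    # Gradient of the likelihood bound w.r.t. the mean weights m, chained
    # through dm/dtheta = 1 - m**2 so that theta remains unconstrained.
    grad_m = (batch.T @ p0 - v1.T @ p1) / len(batch)
    return theta + lr * grad_m * (1.0 - m**2)

# Random binary patterns as placeholder data (binarized digits in the paper).
data = (rng.random((200, n_vis)) < 0.5).astype(float)
for _ in range(50):
    theta = m_step(data, theta)

# A concrete binary network is read out by thresholding the variational means.
w_binary = np.sign(np.tanh(theta))
```

Parameterizing the weight means through tanh keeps theta unconstrained during gradient ascent while the means stay in (-1, 1), and thresholding the means at the end yields one concrete binary synaptic configuration from the learned distribution.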

Citations (1)
