Convergence of Contrastive Divergence with Annealed Learning Rate in Exponential Family (1605.06220v1)

Published 20 May 2016 in stat.ML and cs.LG

Abstract: In our previous work, we showed that in the exponential family, contrastive divergence (CD) with a fixed learning rate gives asymptotically consistent estimates \cite{wu2016convergence}. In this paper, we establish the consistency and convergence rate of CD with an annealed learning rate $\eta_t$. Specifically, suppose CD-$m$ generates the sequence of parameters $\{\theta_t\}_{t \ge 0}$ using an i.i.d. data sample $\mathbf{X}_1^n \sim p_{\theta^*}$ of size $n$; then $\delta_n(\mathbf{X}_1^n) = \limsup_{t \to \infty} \left\Vert \sum_{s=t_0}^{t} \eta_s \theta_s / \sum_{s=t_0}^{t} \eta_s - \theta^* \right\Vert$ converges in probability to 0 at a rate of $1/\sqrt[3]{n}$. The number $m$ of MCMC transitions in CD affects only the coefficient factor of the convergence rate. Our proof is not a simple extension of the one in \cite{wu2016convergence}, which depends critically on the fact that $\{\theta_t\}_{t \ge 0}$ is a homogeneous Markov chain conditional on the observed sample $\mathbf{X}_1^n$. Under an annealed learning rate, the homogeneous Markov property is not available, and we have to develop an alternative approach based on super-martingales. Experimental results of CD on a fully-visible $2\times 2$ Boltzmann Machine are provided to demonstrate our theoretical results.
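To make the setting concrete, the following is a minimal, hedged sketch of CD-$m$ with an annealed learning rate on a small fully visible Boltzmann machine, maintaining the $\eta$-weighted parameter average that appears in $\delta_n(\mathbf{X}_1^n)$. The model size (4 units, reading "$2\times 2$" as a 2-by-2 grid), the schedule $\eta_t = 1/\sqrt{t}$, the choice $m = 3$, and the placeholder data are illustrative assumptions, not the paper's exact experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4        # assumed: 4 binary units (reading "2x2" as a 2-by-2 grid)
m = 3        # CD-m: number of Gibbs transitions per update (assumed)

def gibbs_sweep(x, W, b):
    """One full Gibbs sweep over all units of a fully visible Boltzmann machine."""
    x = x.copy()
    for i in range(d):
        # p(x_i = 1 | x_{-i}) under p_theta with theta = (W, b), zero-diagonal W
        field = W[i] @ x - W[i, i] * x[i] + b[i]
        x[i] = rng.random() < 1.0 / (1.0 + np.exp(-field))
    return x

def cd_update(X, W, b, eta):
    """One CD-m stochastic update from a randomly drawn data point."""
    v = X[rng.integers(len(X))].astype(float)
    h = v
    for _ in range(m):                       # m MCMC transitions started at the data
        h = gibbs_sweep(h, W, b)
    dW = np.outer(v, v) - np.outer(h, h)     # data statistics minus m-step chain statistics
    np.fill_diagonal(dW, 0.0)
    db = v - h
    return W + eta * dW, b + eta * db

# Placeholder data; in the paper X_1^n is an i.i.d. sample from p_{theta*}.
X = rng.integers(0, 2, size=(1000, d))

W, b = np.zeros((d, d)), np.zeros(d)
W_bar, b_bar, eta_sum = np.zeros((d, d)), np.zeros(d), 0.0
for t in range(1, 20001):
    eta_t = 1.0 / np.sqrt(t)                 # assumed annealing schedule
    W, b = cd_update(X, W, b, eta_t)
    # eta-weighted running average: sum_s eta_s * theta_s / sum_s eta_s
    eta_sum += eta_t
    W_bar += eta_t * (W - W_bar) / eta_sum
    b_bar += eta_t * (b - b_bar) / eta_sum
```

Note that the quantity the paper's bound concerns is this weighted average of the iterates, not the individual $\theta_t$ themselves.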
