Leveraging Recursive Gumbel-Max Trick for Approximate Inference in Combinatorial Spaces (2110.15072v1)

Published 28 Oct 2021 in cs.LG

Abstract: Structured latent variables allow meaningful prior knowledge to be incorporated into deep learning models. However, learning with such variables remains challenging because of their discrete nature. The now-standard approach is to define a latent variable as the output of a perturbed algorithm and to train through a differentiable surrogate. In general, the surrogate puts additional constraints on the model and inevitably leads to biased gradients. To alleviate these shortcomings, we extend the Gumbel-Max trick to define distributions over structured domains. We avoid differentiable surrogates by using score-function estimators for optimization. In particular, we highlight a family of recursive algorithms that share a property we call the stochastic invariant. This property allows us to construct reliable gradient estimates and control variates without imposing additional constraints on the model. In our experiments, we consider various structured latent-variable models and achieve results competitive with those of relaxation-based counterparts.
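
To ground the abstract, here is a minimal NumPy sketch of the two ingredients the paper builds on: the classic Gumbel-Max trick for exact categorical sampling, and a plain score-function (REINFORCE) gradient estimator. This is an illustrative reconstruction, not the authors' code; the names `gumbel_max_sample` and `score_function_gradient` are hypothetical, and the paper's actual contribution (the recursive extension to structured domains and its control variates) is not reproduced here.

```python
import numpy as np

def gumbel_max_sample(logits, rng):
    """Draw an exact sample from Categorical(softmax(logits)).

    Adding i.i.d. Gumbel(0, 1) noise to the logits and taking the
    argmax yields an exact sample from the categorical distribution
    (the Gumbel-Max trick).
    """
    gumbel_noise = -np.log(-np.log(rng.uniform(size=logits.shape)))
    return int(np.argmax(logits + gumbel_noise))

def score_function_gradient(logits, reward_fn, rng, n_samples=50_000):
    """Plain REINFORCE estimate of d E[reward(x)] / d logits.

    Uses the categorical score onehot(x) - softmax(logits). No control
    variate is applied here; variance reduction is where the paper's
    stochastic-invariant construction comes in.
    """
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    grad = np.zeros_like(logits)
    for _ in range(n_samples):
        x = gumbel_max_sample(logits, rng)
        score = -probs.copy()
        score[x] += 1.0  # onehot(x) - softmax(logits)
        grad += reward_fn(x) * score
    return grad / n_samples

# Sanity checks.
rng = np.random.default_rng(0)
logits = np.array([1.0, 0.5, -1.0])

# 1) Empirical sample frequencies should approach softmax(logits).
counts = np.bincount(
    [gumbel_max_sample(logits, rng) for _ in range(100_000)],
    minlength=logits.size,
)
print("empirical:", counts / counts.sum())
print("softmax:  ", np.exp(logits) / np.exp(logits).sum())

# 2) Gradient of E[x] w.r.t. the logits, estimated by REINFORCE.
print("grad:", score_function_gradient(logits, lambda x: float(x), rng))
```

The estimator above treats the sample as a single categorical draw; the paper generalizes this setup to recursive algorithms over structured domains, where the stochastic-invariant property makes it possible to build the reliable gradient estimates and control variates mentioned in the abstract.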

Authors (5)
  1. Kirill Struminsky (8 papers)
  2. Artyom Gadetsky (7 papers)
  3. Denis Rakitin (3 papers)
  4. Danil Karpushkin (1 paper)
  5. Dmitry Vetrov (84 papers)
Citations (8)
