A Comparison of PDF Projection with Normalizing Flows and SurVAE (2311.14412v2)

Published 24 Nov 2023 in cs.LG and stat.ME

Abstract: Normalizing flows (NF) recently gained attention as a way to construct generative networks with exact likelihood calculation out of composable layers. However, NF is restricted to dimension-preserving transformations. Surjection VAE (SurVAE) has been proposed to extend NF to dimension-altering transformations. Such networks are desirable because they are expressive and can be precisely trained. We show that the approaches are a re-invention of PDF projection, which appeared over twenty years earlier and is much further developed.
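The exact-likelihood property of normalizing flows mentioned in the abstract comes from the change-of-variables formula: for an invertible, dimension-preserving map z = f(x), log p_x(x) = log p_z(f(x)) + log |det Jf(x)|. A minimal sketch (not from the paper; the affine layer, its parameters, and the standard-normal base density are illustrative assumptions) is:

```python
import numpy as np

def log_standard_normal(z):
    # Log-density of the standard normal base distribution p_z.
    return -0.5 * (z**2 + np.log(2.0 * np.pi))

def affine_flow_log_likelihood(x, a, b):
    # Single invertible affine flow layer z = a*x + b (dimension-preserving).
    # Exact likelihood via change of variables:
    #   log p_x(x) = log p_z(z) + log |dz/dx|, with dz/dx = a here.
    z = a * x + b
    log_det_jac = np.log(np.abs(a))
    return log_standard_normal(z) + log_det_jac

# Illustrative evaluation at x = 0.5 with a = 2.0, b = -1.0 (so z = 0).
ll = affine_flow_log_likelihood(0.5, 2.0, -1.0)
```

Composing several such layers simply sums their log-Jacobian terms, which is what makes deep flows trainable by exact maximum likelihood; dimension-altering maps break this formula, which is the gap SurVAE and PDF projection address.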

