Learning to Importance Sample in Primary Sample Space (1808.07840v2)
Abstract: Importance sampling is one of the most widely used variance reduction strategies in Monte Carlo rendering. In this paper, we propose a novel importance sampling technique that uses a neural network to learn how to sample from a desired density represented by a set of samples. Our approach treats an existing Monte Carlo rendering algorithm as a black box. During a scene-dependent training phase, we learn to generate samples with a desired density in the primary sample space of the rendering algorithm using maximum likelihood estimation. We leverage a recent neural network architecture designed to represent real-valued non-volume preserving ('Real NVP') transformations in high-dimensional spaces. We use Real NVP to non-linearly warp primary sample space and obtain the desired densities. In addition, Real NVP efficiently computes the determinant of the Jacobian of the warp, which is required to implement the change of integration variables implied by the warp. A main advantage of our approach is that it is agnostic to the underlying light transport effects and can be combined with many existing rendering techniques by treating them as a black box. We show that our approach leads to effective variance reduction in several practical scenarios.
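The key mechanism the abstract alludes to is the Real NVP affine coupling layer: it warps one subset of the primary sample space coordinates conditioned on the remaining ones, which gives a triangular Jacobian whose log-determinant is simply the sum of the predicted log-scales. The sketch below illustrates this idea under simplifying assumptions; `affine_coupling_forward`, `scale_fn`, and `shift_fn` are hypothetical placeholders standing in for the small conditioning networks used in practice, not the paper's actual implementation.

```python
import numpy as np

def affine_coupling_forward(x, mask, scale_fn, shift_fn):
    """One Real NVP-style affine coupling layer (illustrative sketch).

    x        : (N, D) batch of points in primary sample space.
    mask     : (D,) binary mask; masked components pass through unchanged
               and condition the scale/shift of the remaining components.
    scale_fn : callable mapping masked inputs to per-component log-scales.
    shift_fn : callable mapping masked inputs to per-component shifts.

    Returns the warped points y and the log-determinant of the Jacobian,
    which reduces to a sum of log-scales because the Jacobian is triangular.
    """
    x_fixed = x * mask                      # components left unchanged
    s = scale_fn(x_fixed) * (1.0 - mask)    # log-scales for the warped part
    t = shift_fn(x_fixed) * (1.0 - mask)    # shifts for the warped part
    y = x_fixed + (1.0 - mask) * (x * np.exp(s) + t)
    log_det_jacobian = np.sum(s, axis=1)    # cheap change-of-variables term
    return y, log_det_jacobian

# Toy conditioning "networks" (hypothetical stand-ins for small MLPs).
D = 4
rng = np.random.default_rng(0)
W_s = rng.normal(size=(D, D)) * 0.1
W_t = rng.normal(size=(D, D)) * 0.1
scale_fn = lambda z: np.tanh(z @ W_s)       # bounded log-scales for stability
shift_fn = lambda z: z @ W_t

mask = np.array([1.0, 1.0, 0.0, 0.0])       # warp the second half of the dims
x = rng.uniform(size=(8, D))                # uniform primary samples
y, log_det = affine_coupling_forward(x, mask, scale_fn, shift_fn)

# Change of variables: if x has density p_x, the warped samples have
#   log p_y(y) = log p_x(x) - log|det dy/dx|,
# so the PDF needed by the Monte Carlo estimator follows directly from
# log_det. Maximum likelihood training, as described in the abstract,
# maximizes this log-density over samples drawn from the target distribution.
```

Stacking several such layers with alternating masks, as in the Real NVP architecture, lets all dimensions be warped while keeping both the inverse and the Jacobian determinant cheap to evaluate.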
- TensorFlow: Large-scale machine learning on heterogeneous systems, 2015. Software available from tensorflow.org. URL: https://www.tensorflow.org/.
- Reversible jump Metropolis light transport using inverse mappings. ACM Transactions on Graphics (TOG) 37, 1 (2017), 1.
- Kernel-predicting convolutional networks for denoising Monte Carlo renderings. ACM Trans. Graph. 36, 4 (July 2017), 97:1–97:14.
- Wavelet importance sampling: Efficiently evaluating products of complex functions. ACM Trans. Graph. 24, 3 (July 2005), 1166–1175.
- Interactive Reconstruction of Monte Carlo Image Sequences using a Recurrent Denoising Autoencoder. ACM Transactions on Graphics (Aug 2017).
- Dahm K., Keller A.: Learning light transport the reinforced way. arXiv preprint arXiv:1701.07403 (2017).
- Density estimation using Real NVP. CoRR abs/1605.08803 (2016).
- Glorot X., Bengio Y.: Understanding the difficulty of training deep feedforward neural networks. In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics (2010), pp. 249–256.
- Primary sample space path guiding. In Eurographics Symposium on Rendering - EI & I (July 2018), Jakob W., Hachisuka T., (Eds.), The Eurographics Association, pp. 73–82. doi:10.2312/sre.20181174.
- Generative adversarial nets. In Advances in Neural Information Processing Systems 27, Ghahramani Z., Welling M., Cortes C., Lawrence N. D., Weinberger K. Q., (Eds.). Curran Associates, Inc., 2014, pp. 2672–2680.
- Heitz E., d’Eon E.: Importance sampling microfacet-based BSDFs using the distribution of visible normals. Computer Graphics Forum 33, 4 (2014), 103–112.
- Multiplexed Metropolis light transport. ACM Transactions on Graphics (TOG) 33, 4 (2014), 100.
- Deep residual learning for image recognition. In 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (June 2016), pp. 770–778.
- Jakob W., Marschner S.: Manifold exploration: A Markov chain Monte Carlo technique for rendering scenes with difficult specular transport. ACM Transactions on Graphics (TOG) 31, 4 (2012), 58.
- Kajiya J. T.: The rendering equation. In ACM SIGGRAPH Computer Graphics (1986), vol. 20, ACM, pp. 143–150.
- Kingma D. P., Ba J.: Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980 (2014).
- The natural-constraint representation of the path space for efficient light transport simulation. ACM Transactions on Graphics (TOG) 33, 4 (2014), 102.
- Deep scattering: Rendering atmospheric clouds with radiance-predicting neural networks. ACM Trans. Graph. 36, 6 (Nov. 2017), 231:1–231:11.
- A simple and robust mutation strategy for the Metropolis light transport algorithm. In Computer Graphics Forum (2002), vol. 21, Wiley Online Library, pp. 531–540.
- Kingma D. P., Welling M.: Auto-encoding variational Bayes. In ICLR (2014).
- Anisotropic Gaussian mutations for Metropolis light transport through Hessian-Hamiltonian dynamics. ACM Transactions on Graphics (TOG) 34, 6 (2015), 209.
- Practical path guiding for efficient light-transport simulation. In Computer Graphics Forum (2017), vol. 36, Wiley Online Library, pp. 91–100.
- Neural importance sampling. arXiv preprint arXiv:1808.03856 (2018).
- Fusing state spaces for Markov chain Monte Carlo rendering. ACM Transactions on Graphics (TOG) 36, 4 (2017), 74.
- Physically Based Rendering: From Theory to Implementation, 3rd ed. Morgan Kaufmann, 2016.
- Global illumination with radiance regression functions. ACM Trans. Graph. 32, 4 (July 2013), 130:1–130:12.
- Importance resampling for global illumination. In Proceedings of the Sixteenth Eurographics Conference on Rendering Techniques (Aire-la-Ville, Switzerland, 2005), EGSR '05, Eurographics Association, pp. 139–146.
- Veach E., Guibas L. J.: Optimally combining sampling techniques for Monte Carlo rendering. In Proceedings of the 22nd Annual Conference on Computer Graphics and Interactive Techniques (1995), ACM, pp. 419–428.
- Veach E., Guibas L. J.: Metropolis light transport. In Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques (1997), ACM Press/Addison-Wesley Publishing Co., pp. 65–76.
- Bayesian online regression for adaptive direct illumination sampling. ACM Transactions on Graphics (Proceedings of SIGGRAPH 2018) 37, 4 (2018).
- On-line learning of parametric mixture models for light transport simulation. ACM Transactions on Graphics (TOG) 33, 4 (2014), 101.
- Image quality assessment: From error visibility to structural similarity. IEEE Transactions on Image Processing 13, 4 (2004), 600–612.
- Lightcuts: A scalable approach to illumination. ACM Trans. Graph. 24, 3 (July 2005), 1098–1107.