Learning Compact Convolutional Neural Networks with Nested Dropout (1412.7155v4)
Published 22 Dec 2014 in cs.CV, cs.LG, and cs.NE
Abstract: Recently, nested dropout was proposed as a method for ordering representation units in autoencoders by their information content, without diminishing reconstruction quality. However, it has only been applied to training fully-connected autoencoders in an unsupervised setting. We explore the impact of nested dropout on the convolutional layers in a CNN trained by backpropagation, investigating whether nested dropout can provide a simple and systematic way to determine the optimal representation size with respect to the desired accuracy and the task and data complexity.
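The ordering mechanism behind nested dropout can be sketched as follows: for each training example, a keep-length is drawn from a geometric distribution and every unit after that index is zeroed, so earlier units are updated more often and accumulate more information. This is a minimal NumPy illustration of that masking step, not the paper's implementation; the function name and parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def nested_dropout(activations, p=0.1, rng=rng):
    """Apply a nested-dropout mask to a batch of activation vectors.

    For each example, sample a keep-length b from a geometric
    distribution (truncated at the layer width) and zero out every
    unit at index >= b, so units acquire an importance ordering.
    """
    n, k = activations.shape
    # Keep-lengths b in {1, ..., k}; the geometric tail is truncated at k.
    b = np.minimum(rng.geometric(p, size=n), k)
    # Unit j is kept for example i iff j < b[i] (a prefix mask per row).
    mask = np.arange(k)[None, :] < b[:, None]
    return activations * mask

# Example: mask a batch of 4 examples with 8 units each.
x = rng.normal(size=(4, 8))
y = nested_dropout(x)
```

Applied to a convolutional layer, as the paper investigates, the same prefix masking would run over feature maps (channels) rather than individual units; the size of the prefix that suffices for the target accuracy then suggests the representation size.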