Structured Convolution Matrices for Energy-efficient Deep learning (1606.02407v1)
Published 8 Jun 2016 in cs.NE, cs.AI, cs.CV, and cs.LG
Abstract: We derive a relationship between network representation in energy-efficient neuromorphic architectures and block Toeplitz convolutional matrices. Inspired by this connection, we develop deep convolutional networks using a family of structured convolutional matrices and achieve a state-of-the-art trade-off between energy efficiency and classification accuracy on well-known image recognition tasks. We also put forward a novel method to train binary convolutional networks by utilising an existing connection between noisy rectified linear units and binary activations.
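The abstract's starting point is the standard identity that a 2D convolution can be written as a matrix-vector product with a doubly block Toeplitz matrix. Below is a minimal sketch of that identity, not the paper's method: the function name `conv2d_toeplitz`, the use of NumPy/SciPy, and the small example shapes are all illustrative assumptions, and the check against `scipy.signal.convolve2d` simply confirms the construction.

```python
# Illustrative sketch (not from the paper): express full 2D convolution as
# multiplication by a doubly block Toeplitz matrix and verify against SciPy.
import numpy as np
from scipy.linalg import toeplitz
from scipy.signal import convolve2d


def conv2d_toeplitz(kernel, input_shape):
    """Return T such that (T @ x.ravel()).reshape(out_shape) equals
    convolve2d(x, kernel, mode='full') for x of shape input_shape."""
    k_h, k_w = kernel.shape
    i_h, i_w = input_shape
    o_h, o_w = i_h + k_h - 1, i_w + k_w - 1

    # Zero-pad the kernel rows to the output width/height.
    K = np.zeros((o_h, o_w))
    K[:k_h, :k_w] = kernel

    # One Toeplitz block per padded kernel row: maps an input row (length i_w)
    # to its contribution to an output row (length o_w).
    blocks = [
        toeplitz(K[r], np.r_[K[r, 0], np.zeros(i_w - 1)])  # shape (o_w, i_w)
        for r in range(o_h)
    ]

    # Assemble the blocks in a block Toeplitz pattern: block (p, q) = blocks[p - q].
    T = np.zeros((o_h * o_w, i_h * i_w))
    for p in range(o_h):
        for q in range(i_h):
            if p - q >= 0:
                T[p * o_w:(p + 1) * o_w, q * i_w:(q + 1) * i_w] = blocks[p - q]
    return T


# Small sanity check with arbitrary example shapes (assumed, not from the paper).
x = np.random.randn(4, 5)
k = np.random.randn(3, 3)
T = conv2d_toeplitz(k, x.shape)
y_ref = convolve2d(x, k, mode='full')
y_mat = (T @ x.ravel()).reshape(y_ref.shape)
assert np.allclose(y_ref, y_mat)
```

The structured (block Toeplitz) view matters here because the matrix is fully determined by the small kernel, which is what makes such layers attractive for weight-constrained, energy-efficient neuromorphic hardware.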