BoolNet: Minimizing The Energy Consumption of Binary Neural Networks (2106.06991v1)

Published 13 Jun 2021 in cs.LG and cs.AI

Abstract: Recent works on Binary Neural Networks (BNNs) have made promising progress in narrowing the accuracy gap between BNNs and their 32-bit counterparts. However, the accuracy gains are often based on specialized model designs using additional 32-bit components. Furthermore, almost all previous BNNs use 32-bit for feature maps and the shortcuts enclosing the corresponding binary convolution blocks, which helps to effectively maintain the accuracy, but is not friendly to hardware accelerators with limited memory, energy, and computing resources. Thus, we raise the following question: How can accuracy and energy consumption be balanced in a BNN design? We extensively study this fundamental problem in this work and propose a novel BNN architecture without most commonly used 32-bit components: BoolNet. Experimental results on ImageNet demonstrate that BoolNet can achieve 4.6x energy reduction coupled with 1.2% higher accuracy than the commonly used BNN architecture Bi-RealNet. Code and trained models are available at: https://github.com/hpi-xnor/BoolNet.
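To make the binarization the abstract refers to concrete, here is a minimal sketch of a binary convolution in the style common to BNNs (sign-function binarization of weights and activations). This is an illustrative assumption, not BoolNet's actual architecture; the function names `binarize` and `binary_conv1d` are hypothetical.

```python
import numpy as np

def binarize(x):
    # Map real values to {-1, +1}; a common BNN convention maps 0 to +1.
    return np.where(x >= 0, 1.0, -1.0)

def binary_conv1d(x, w):
    # Valid-mode 1D convolution where both input and kernel are binarized.
    # On hardware, the {-1, +1} dot product reduces to XNOR + popcount,
    # which is the source of BNNs' energy savings.
    xb, wb = binarize(x), binarize(w)
    k = len(wb)
    n = len(xb) - k + 1
    return np.array([np.dot(xb[i:i + k], wb) for i in range(n)])

# Example: a length-4 input convolved with a length-2 kernel.
out = binary_conv1d(np.array([0.5, -1.2, 0.3, 2.0]),
                    np.array([1.0, -0.5]))
print(out)  # [ 2. -2.  0.]
```

Note that the 32-bit shortcuts criticized in the abstract would add the full-precision input `x` (not `xb`) to the convolution output, which is exactly the memory-traffic cost BoolNet aims to remove.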

Authors (7)
  1. Nianhui Guo (2 papers)
  2. Joseph Bethge (9 papers)
  3. Haojin Yang (38 papers)
  4. Kai Zhong (21 papers)
  5. Xuefei Ning (52 papers)
  6. Christoph Meinel (51 papers)
  7. Yu Wang (939 papers)
Citations (11)