
BitWave: Exploiting Column-Based Bit-Level Sparsity for Deep Learning Acceleration

Published 16 Jul 2025 in eess.SY and cs.SY | (2507.12444v1)

Abstract: Bit-serial computation facilitates bit-wise sequential data processing, offering numerous benefits such as a reduced area footprint and dynamically adaptive computational precision. It has emerged as a prominent approach, particularly for leveraging bit-level sparsity in Deep Neural Networks (DNNs). However, while existing bit-serial accelerators exploit bit-level sparsity to reduce computation by skipping zero bits, they suffer from inefficient memory accesses due to the irregular indices of the non-zero bits. As memory accesses are typically the dominant contributor to DNN accelerator performance, this paper introduces a novel computing approach called "bit-column-serial" and a compatible architecture design named "BitWave." BitWave harnesses the advantages of the bit-column-serial approach, combining structured bit-level sparsity with dynamic dataflow techniques. This reduces both computation and memory footprint through redundant-computation skipping and weight compression. BitWave mitigates the accuracy drop or retraining need typically associated with sparsity-enhancing techniques by using a post-training optimization involving selected weight bit-flips. Empirical studies on four deep-learning benchmarks demonstrate BitWave's achievements: (1) up to 13.25x higher speedup and 7.71x higher efficiency compared to state-of-the-art sparsity-aware accelerators; (2) an area of 1.138 mm2 and power consumption of 17.56 mW in a 16nm FinFET process node.
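To make the abstract's distinction concrete, here is a minimal, illustrative sketch (not the paper's implementation) contrasting conventional bit-serial zero-bit skipping, which skips bits at irregular positions per weight, with a column-based variant in the spirit of bit-column-serial: weights are grouped and an entire bit column (the same bit position across the group) is skipped only when it is zero for every weight, keeping the skip pattern structured. All function names and the 8-bit unsigned weight assumption are hypothetical.

```python
def bit_serial_dot(activations, weights, bits=8):
    """Dot product computed bit-serially over weight bits.

    Skips individual zero weight bits, which saves work but leaves
    the non-zero bit positions irregular across weights.
    Assumes non-negative integer weights representable in `bits` bits.
    """
    acc = 0
    for a, w in zip(activations, weights):
        for b in range(bits):
            if (w >> b) & 1:          # skip zero weight bits
                acc += a << b         # shift-and-add partial product
    return acc


def bit_column_serial_dot(activations, weights, bits=8):
    """Column-based variant: process one bit position across the whole
    weight group at a time, and skip the column only when it is zero
    for every weight (structured, regularly indexed sparsity)."""
    acc = 0
    for b in range(bits):
        column = [(w >> b) & 1 for w in weights]
        if not any(column):           # skip an all-zero bit column
            continue
        partial = sum(a for a, c in zip(activations, column) if c)
        acc += partial << b
    return acc
```

Both routines compute the same dot product as `sum(a * w for a, w in zip(activations, weights))`; the difference is only in which zero bits are skippable. This also hints at why the paper's selective weight bit-flips help: flipping a few stray 1-bits to 0 can turn a nearly empty bit column into a fully skippable one.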
