Exponential Lower Bounds for Threshold Circuits of Sub-Linear Depth and Energy (2107.00223v2)

Published 1 Jul 2021 in cs.CC, cs.CV, cs.LG, and cs.NE

Abstract: In this paper, we investigate the computational power of threshold circuits and other theoretical models of neural networks in terms of the following four complexity measures: size (the number of gates), depth, weight and energy. Here the energy complexity of a circuit measures the sparsity of its computation, and is defined as the maximum number of gates outputting non-zero values taken over all the input assignments. As our main result, we prove that any threshold circuit $C$ of size $s$, depth $d$, energy $e$ and weight $w$ satisfies $\log (rk(M_C)) \le ed (\log s + \log w + \log n)$, where $rk(M_C)$ is the rank of the communication matrix $M_C$ of the $2n$-variable Boolean function that $C$ computes. Thus, such a threshold circuit $C$ is able to compute only a Boolean function whose communication matrix has rank bounded by a product of logarithmic factors of $s, w$ and linear factors of $d, e$. This implies an exponential lower bound on the size of even sub-linear-depth threshold circuits if the energy and weight are sufficiently small. For other models of neural networks, such as discretized ReLU circuits and discretized sigmoid circuits, we prove that a similar inequality also holds for a discretized circuit $C$: $rk(M_C) = O(ed(\log s + \log w + \log n)^3)$.
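The two quantities related by the main inequality can be made concrete with a small example. The sketch below is not taken from the paper; the circuit, its weights, and its thresholds are hypothetical choices for illustration only. It builds a tiny threshold circuit for the inner product mod 2 on $n = 2$, measures its energy by brute force over all input assignments, and computes the rank of its communication matrix with NumPy.

```python
# A minimal illustrative sketch (not from the paper): it measures the energy of a
# small threshold circuit and the rank of the communication matrix M_C of the
# 2n-variable Boolean function the circuit computes. The circuit, weights, and
# thresholds are hypothetical choices made for this example only.

import itertools

import numpy as np


def threshold_gate(weights, threshold, inputs):
    """Output 1 iff the weighted sum of the inputs reaches the threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0


def circuit(x, y):
    """A size-5, depth-3 threshold circuit on (x1, x2, y1, y2) computing the
    inner product mod 2 of x and y (IP_2 for n = 2).
    Returns (output bit, outputs of all gates)."""
    g1 = threshold_gate([1, 1], 2, [x[0], y[0]])   # x1 AND y1
    g2 = threshold_gate([1, 1], 2, [x[1], y[1]])   # x2 AND y2
    g3 = threshold_gate([1, 1], 1, [g1, g2])       # g1 OR g2
    g4 = threshold_gate([1, 1], 2, [g1, g2])       # g1 AND g2
    out = threshold_gate([1, -1], 1, [g3, g4])     # g3 AND NOT g4 = g1 XOR g2
    return out, [g1, g2, g3, g4, out]


n = 2
assignments = list(itertools.product([0, 1], repeat=n))

# Energy: maximum number of gates outputting a non-zero value, over all inputs.
energy = 0
for x in assignments:
    for y in assignments:
        _, gates = circuit(x, y)
        energy = max(energy, sum(gates))

# Communication matrix: rows indexed by x-assignments, columns by y-assignments.
M = np.array([[circuit(x, y)[0] for y in assignments] for x in assignments])
rank = np.linalg.matrix_rank(M)

print(f"energy e = {energy}, rank of M_C = {rank}")  # e = 4, rk(M_C) = 3
```

For this toy circuit the inequality is far from tight, but the example makes the roles of $e$, $d$, $s$, $w$, $n$ and $rk(M_C)$ concrete.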
