Generalized Holographic Reduced Representations (2405.09689v1)

Published 15 May 2024 in cs.LG, cs.AI, and cs.SC

Abstract: Deep learning has achieved remarkable success in recent years. Central to its success is its ability to learn representations that preserve task-relevant structure. However, massive energy, compute, and data costs are required to learn general representations. This paper explores Hyperdimensional Computing (HDC), a computationally and data-efficient brain-inspired alternative. HDC acts as a bridge between connectionist and symbolic approaches to AI, allowing explicit specification of representational structure as in symbolic approaches while retaining the flexibility of connectionist approaches. However, HDC's simplicity poses challenges for encoding complex compositional structures, especially in its binding operation. To address this, we propose Generalized Holographic Reduced Representations (GHRR), an extension of Fourier Holographic Reduced Representations (FHRR), a specific HDC implementation. GHRR introduces a flexible, non-commutative binding operation, enabling improved encoding of complex data structures while preserving HDC's desirable properties of robustness and transparency. In this work, we introduce the GHRR framework, prove its theoretical properties and its adherence to HDC properties, explore its kernel and binding characteristics, and perform empirical experiments showcasing its flexible non-commutativity, enhanced decoding accuracy for compositional structures, and improved memorization capacity compared to FHRR.
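
The abstract's central contrast, FHRR's commutative binding versus GHRR's flexible non-commutative one, can be made concrete with a short sketch. The snippet below is a minimal illustration rather than the paper's implementation: it assumes, following the paper's construction, that GHRR replaces each unit-modulus complex element of an FHRR hypervector with a small unitary matrix, so that binding becomes elementwise matrix multiplication; the function names and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_fhrr(d):
    # FHRR hypervector: d i.i.d. unit-modulus complex phasors.
    return np.exp(1j * rng.uniform(-np.pi, np.pi, size=d))

def random_ghrr(d, m):
    # GHRR-style hypervector (illustrative): d independent m x m unitary
    # blocks, each obtained via QR decomposition of a complex Gaussian matrix.
    blocks = []
    for _ in range(d):
        z = rng.normal(size=(m, m)) + 1j * rng.normal(size=(m, m))
        q, _ = np.linalg.qr(z)
        blocks.append(q)
    return np.stack(blocks)

# FHRR binding: elementwise phasor multiplication -- always commutative.
a, b = random_fhrr(1024), random_fhrr(1024)
assert np.allclose(a * b, b * a)

# GHRR-style binding: multiply corresponding unitary blocks. Matrix
# products do not commute in general, so neither does the binding.
A, B = random_ghrr(256, 3), random_ghrr(256, 3)
AB = np.einsum('nij,njk->nik', A, B)
BA = np.einsum('nij,njk->nik', B, A)
print("binding commutes:", np.allclose(AB, BA))  # expected: False for m > 1
```

With m = 1 the unitary blocks collapse to scalar phasors and the construction reduces to elementwise complex multiplication, which is the sense in which GHRR generalizes FHRR while retaining its robustness and transparency properties.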

