Self-Attention Based Semantic Decomposition in Vector Symbolic Architectures (2403.13218v1)
Abstract: Vector Symbolic Architectures (VSAs) have emerged as a novel framework for interpretable machine learning algorithms that can reason about and explain their decision processes. The basic idea is to represent discrete information with high-dimensional random vectors. Complex data structures are built up through operations on these vectors, such as the "binding" operation (element-wise vector multiplication), which associates pieces of data with one another. The reverse task, decomposing a bound vector into its constituent elements, is combinatorially hard, with an exponentially large search space. The main algorithm for performing this search is the resonator network, inspired by memory retrieval in Hopfield networks. In this work, we introduce a new variant of the resonator network that uses self-attention based update rules in the iterative search. This update rule, derived from the Hopfield network with a log-sum-exp energy function and norm-bounded states, substantially improves both factorization performance and the rate of convergence. As a result, our algorithm supports a larger associative memory capacity, enabling applications in tasks such as perception-based pattern recognition, scene decomposition, and object reasoning. We substantiate our algorithm with a thorough evaluation and comparisons against baselines.
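To make the setup concrete, here is a minimal NumPy sketch of binding, the classic resonator update, and an attention-style variant. The abstract does not give the exact update rule, so the softmax-weighted cleanup below (controlled by `beta`) is an illustrative stand-in for the paper's log-sum-exp Hopfield update, and the re-binarization of estimates is a simplifying assumption (the paper instead keeps norm-bounded states); all function names and parameter values are hypothetical.

```python
import numpy as np

def bipolar(v):
    """Zero-safe sign: map >= 0 to +1 and < 0 to -1."""
    return np.where(v >= 0, 1.0, -1.0)

def resonator_factorize(s, codebooks, beta=None, iters=100):
    """Factor a bound vector s = x1 * x2 * ... (element-wise product of
    bipolar codevectors) into one codevector index per codebook.

    beta=None  -> classic resonator cleanup (linear projection + sign)
    beta=float -> softmax-weighted cleanup, an attention-style stand-in
                  for the log-sum-exp Hopfield update (assumed form)
    """
    dim = s.shape[0]
    # Standard initialization: the superposition of each codebook.
    est = [bipolar(cb.sum(axis=0)) for cb in codebooks]
    for _ in range(iters):
        for f, cb in enumerate(codebooks):
            # Unbind the other factors' current estimates from s.
            # Bipolar binding is self-inverse: x * x = all-ones.
            others = np.ones(dim)
            for g, e in enumerate(est):
                if g != f:
                    others = others * e
            target = s * others
            scores = cb @ target                      # similarity to each codevector
            if beta is None:
                est[f] = bipolar(cb.T @ scores)       # linear cleanup + sign
            else:
                logits = beta * scores
                attn = np.exp(logits - logits.max())  # numerically stable softmax
                attn /= attn.sum()
                # Attention-weighted cleanup; re-binarized here for simplicity.
                est[f] = bipolar(cb.T @ attn)
    # Read out the best-matching codevector index for each factor.
    return [int(np.argmax(cb @ e)) for cb, e in zip(codebooks, est)]

rng = np.random.default_rng(0)
D, M = 2000, 20                                            # dimension, codebook size
codebooks = [rng.choice([-1.0, 1.0], size=(M, D)) for _ in range(3)]
s = codebooks[0][3] * codebooks[1][7] * codebooks[2][11]   # bind three factors
print(resonator_factorize(s, codebooks, beta=0.1))         # expected: [3, 7, 11]
```

Note that the search space here is M^3 = 8,000 combinations, yet each iteration only touches 3M = 60 codevectors; this is the combinatorial savings the resonator network provides. The sharpness parameter `beta` interpolates between a nearly linear cleanup (small `beta`) and a hard nearest-codevector lookup (large `beta`).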