Sparse distributed representation, hierarchy, critical periods, metaplasticity: the keys to lifelong fixed-time learning and best-match retrieval (1806.04704v1)
Abstract: Among the more important hallmarks of human intelligence, which any artificial general intelligence (AGI) should have, are the following. 1. It must be capable of on-line learning, including with single/few trials. 2. Memories/knowledge must be permanent over lifelong durations, safe from catastrophic forgetting. Some confabulation, i.e., semantically plausible retrieval errors, may gradually accumulate over time. 3. The time to both: a) learn a new item, and b) retrieve the best-matching / most relevant item(s), i.e., do similarity-based retrieval, must remain constant throughout the lifetime. 4. The system should never become full: it must remain able to store new information, i.e., make new permanent memories, throughout very long lifetimes. No artificial computational system has been shown to have all these properties. Here, we describe a neuromorphic associative memory model, Sparsey, which, in principle, possesses them all. We cite prior results supporting possession of hallmarks 1 and 3 and sketch an argument, hinging on the strongly recursive, hierarchical, part-whole compositional structure of natural data, that Sparsey also possesses hallmarks 2 and 4.
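The fixed-time claim of hallmark 3 rests on Sparsey's sparse distributed code format: a code is a set of winners, one per winner-take-all module, and both learning and retrieval are a single pass over a fixed-size weight matrix, so their cost does not grow with the number of stored items. The sketch below is a loose, hypothetical illustration of that idea only, not Sparsey's actual Code Selection Algorithm; the class name, parameter values, and noise-based winner selection are invented here for the example.

```python
# Toy SDR associative memory, NOT Sparsey itself: codes are one winner in
# each of Q winner-take-all modules of K cells. Storing and retrieving each
# cost one pass over a fixed-size weight matrix, independent of the number
# of items already stored (the fixed-time property claimed in hallmark 3).
import numpy as np

class TinySdrMemory:  # hypothetical class, for illustration only
    def __init__(self, n_inputs, q_modules=8, k_per_module=16, seed=0):
        self.rng = np.random.default_rng(seed)
        self.Q, self.K = q_modules, k_per_module
        # Effectively binary weights from every input bit to every coding cell.
        self.W = np.zeros((n_inputs, self.Q * self.K))

    def _choose_code(self, x):
        # Input summation for every coding cell; tiny uniform noise breaks
        # ties and stands in for Sparsey's novelty-modulated winner choice.
        sums = x @ self.W + self.rng.random(self.Q * self.K) * 1e-6
        winners = sums.reshape(self.Q, self.K).argmax(axis=1)
        return winners + self.K * np.arange(self.Q)  # flat cell indices

    def store(self, x):
        """Single-trial learning: one pass sets weights from active inputs
        to the chosen code, regardless of how many items are stored."""
        code = self._choose_code(x)
        self.W[np.ix_(x.astype(bool), code)] = 1.0
        return code

    def retrieve(self, x):
        """Fixed-time best-match retrieval: the same single pass reactivates
        the code whose learned inputs overlap x the most."""
        return self._choose_code(x)

mem = TinySdrMemory(n_inputs=64)
x = (np.random.default_rng(1).random(64) < 0.2).astype(float)
code = mem.store(x)                            # learn in one trial
assert np.array_equal(mem.retrieve(x), code)   # stored input recalls its code
```

In the actual model, winner selection is modulated by a global familiarity measure so that novel inputs receive more random, less overlapping codes; the uniform noise above only breaks ties and is a stand-in for that mechanism.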