
Controlling Recurrent Neural Networks by Diagonal Conceptors (2107.07968v1)

Published 16 Jul 2021 in cs.NE

Abstract: The human brain is capable of learning, memorizing, and regenerating a panoply of temporal patterns. A neuro-dynamical mechanism called conceptors offers a method for controlling the dynamics of a recurrent neural network, by which a variety of temporal patterns can be learned and recalled. However, conceptors are matrices whose size scales quadratically with the number of neurons in the recurrent neural network, so they quickly become impractical. In the work reported in this thesis, a variation of conceptors is introduced, called diagonal conceptors, which are diagonal matrices, thus reducing the computational cost drastically. It will be shown that diagonal conceptors achieve the same accuracy as conceptors, but are slightly more unstable. This instability can be mitigated, but requires further research. Nevertheless, diagonal conceptors prove to be a promising practical alternative to the standard full-matrix conceptors.
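The storage saving the abstract describes can be sketched numerically. In the standard conceptor framework, a conceptor is computed from the reservoir state correlation matrix R as C = R(R + α⁻²I)⁻¹, an N×N matrix; a diagonal conceptor keeps only one scalar gate per neuron, reducing storage from N² to N. The following is a minimal illustrative sketch (not code from the thesis; the function names, aperture value, and use of per-neuron variances for the diagonal variant are assumptions for illustration):

```python
import numpy as np

def full_conceptor(states, aperture):
    """Standard conceptor: C = R (R + aperture^-2 I)^-1, an N x N matrix.

    states: (T, N) array of reservoir states collected while the
    network is driven by a pattern.
    """
    T, N = states.shape
    R = states.T @ states / T                      # N x N state correlation
    return R @ np.linalg.inv(R + aperture**-2 * np.eye(N))

def diagonal_conceptor(states, aperture):
    """Diagonal variant: keep only per-neuron second moments,
    giving one scalar gate per neuron (length-N vector, O(N) storage)."""
    r = np.mean(states**2, axis=0)                 # length-N
    return r / (r + aperture**-2)                  # elementwise

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 20))                 # toy "reservoir states"
C = full_conceptor(X, aperture=8.0)                # shape (20, 20)
c = diagonal_conceptor(X, aperture=8.0)            # shape (20,)
```

Both objects act as soft gates on the reservoir state (full: x ← Cx; diagonal: x ← c ⊙ x), with gate values approaching 1 for high-variance directions and 0 for low-variance ones, which is what makes the diagonal version a drop-in, cheaper approximation.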
