
Active Sites model for the B-Matrix Approach (1006.4754v1)

Published 24 Jun 2010 in cs.NE

Abstract: This paper continues the work on the B-Matrix approach to Hebbian learning proposed by Dr. Kak. It reports results on methods for improving the memory retrieval capacity of a Hebbian neural network that implements the B-Matrix approach. Previously, memories were retrieved by clamping each neuron of the network separately and verifying the integrity of the recalled memories. Here we present a network capable of identifying the "active sites" of the network during the training phase and using these active sites to generate the memories retrieved from those neurons. Three methods are proposed for obtaining the update order of the network from the proximity matrix when multiple neurons are to be clamped. We then compare the new methods with the classical case and with one another.
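
The abstract assumes the standard B-Matrix retrieval mechanism: a Hebbian weight matrix T is split as T = B + B^T with B lower triangular, a single neuron is clamped, and activity spreads through the remaining neurons one at a time in a chosen update order. The sketch below is a minimal illustration of that mechanism only, not the paper's method; the bipolar (+1/-1) encoding, the `train`/`retrieve` names, and the default index-based update order are assumptions, and the active-site detection and proximity-matrix ordering proposed in the paper are not implemented here.

```python
import numpy as np

# Minimal sketch of B-Matrix style retrieval (illustrative, not the paper's code).
# Assumes bipolar memories and Hebbian outer-product learning with zero diagonal.

def train(memories):
    """Hebbian weight matrix T = sum of outer products of the stored memories."""
    n = memories.shape[1]
    T = np.zeros((n, n), dtype=int)
    for m in memories:
        T += np.outer(m, m)
    np.fill_diagonal(T, 0)  # no self-connections
    return T

def retrieve(T, clamp_index, clamp_value, order=None):
    """Regenerate a memory from one clamped neuron.

    B is the lower-triangular part of T (so T = B + B^T) after reindexing the
    neurons in the given update order; the fragment grows by one neuron per step.
    `order` defaults to plain index order (the classical case); a proximity-based
    ordering could be passed in instead.
    """
    n = T.shape[0]
    order = list(range(n)) if order is None else list(order)
    order.remove(clamp_index)
    order = [clamp_index] + order           # clamped neuron is updated first
    P = np.eye(n, dtype=int)[order]         # permutation into update order
    B = np.tril(P @ T @ P.T)                # lower-triangular "B" matrix
    f = np.zeros(n, dtype=int)
    f[0] = clamp_value                      # start from the clamped neuron alone
    for k in range(1, n):                   # spread activity one neuron at a time
        f[k] = 1 if B[k, :k] @ f[:k] >= 0 else -1   # ties broken toward +1
    return P.T @ f                          # back to original neuron indexing

if __name__ == "__main__":
    mems = np.array([[1, -1, 1, -1, 1],
                     [-1, -1, 1, 1, -1]])
    T = train(mems)
    print(retrieve(T, clamp_index=0, clamp_value=1))
```

In these terms, the paper's contribution is to choose which neurons to clamp (the active sites identified during training) and to derive the update order from the proximity matrix rather than using the fixed index order shown above.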

Citations (5)
