
The connectivity degree controls the difficulty of RBN reservoir design (2312.08522v1)

Published 13 Dec 2023 in nlin.AO, nlin.CD, nlin.CG, and q-bio.NC

Abstract: Reservoir Computing (RC) is a paradigm in artificial intelligence where a recurrent neural network (RNN) is used to process temporal data, leveraging the inherent dynamical properties of the reservoir to perform complex computations. In RC, the excitatory-inhibitory balance b has been shown to be pivotal in driving the dynamics and performance of Echo State Networks (ESNs) and, more recently, Random Boolean Networks (RBNs). However, the relationship between b and the other parameters of the network is still poorly understood. This article explores how the interplay of the balance b, the connectivity degree K (i.e., the number of synapses per neuron), and the size of the network N (i.e., the number of neurons) influences the dynamics and performance (memory and prediction) of an RBN reservoir. Our findings reveal that K and b are strongly tied in optimal reservoirs. Reservoirs with high K have two optimal balances, one for globally inhibitory networks (b<0) and one for excitatory networks (b>0); both show asymmetric performance about zero balance. In contrast, for moderate K, with the optimum at K=4, the best reservoirs are obtained when excitation and inhibition almost, but not exactly, balance each other. For almost all K, increasing the size N leads to better performance, even for large values of N. Our investigation provides clear directions for generating optimal reservoirs, or reservoirs with constraints on size or connectivity.
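To make the setup concrete, the sketch below builds a small RBN reservoir parameterized by the size N, the connectivity degree K, and the balance b. This is a minimal illustration, not the paper's implementation: it assumes a threshold-style synchronous update and interprets b as biasing each connection toward excitatory (+1) or inhibitory (-1) with probability (1+b)/2; all names and the update rule are illustrative assumptions.

```python
import numpy as np

def build_rbn_reservoir(N, K, b, seed=0):
    """Hypothetical RBN reservoir constructor.

    Each of the N nodes receives K inputs from randomly chosen nodes.
    A connection is excitatory (+1) with probability (1 + b) / 2 and
    inhibitory (-1) otherwise, so b in [-1, 1] sets the balance.
    """
    rng = np.random.default_rng(seed)
    # K presynaptic neighbours per node, chosen uniformly at random
    neighbours = rng.integers(0, N, size=(N, K))
    # +1 / -1 synaptic signs drawn according to the balance b
    signs = np.where(rng.random((N, K)) < (1 + b) / 2, 1, -1)
    return neighbours, signs

def step(state, neighbours, signs, u=0.0):
    """One synchronous update: threshold the signed input sum plus a drive u."""
    drive = (signs * state[neighbours]).sum(axis=1) + u
    return (drive > 0).astype(int)

# Example: moderate connectivity K=4, slightly excitation-biased network
N, K, b = 500, 4, 0.05
neighbours, signs = build_rbn_reservoir(N, K, b)
state = np.random.default_rng(1).integers(0, 2, size=N)
for t in range(10):
    state = step(state, neighbours, signs, u=0.0)
```

In a full RC pipeline one would drive the reservoir with an input sequence, collect the state trajectory, and train a linear readout on it; only the reservoir construction is sketched here.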
