Self-contained learning without external readout training in SOMN reservoirs

Develop methods for Self-Organising Memristive Networks (SOMNs) that achieve self-contained learning, i.e. learning that does not require training an external output layer, thereby enabling in-situ adaptation within the physical substrate.

Background

In the standard reservoir computing paradigm, only the output layer is trained while the reservoir remains fixed. In physical implementations with SOMNs, this typically entails software-based training of the readout layer, which partially offsets the benefits of in-materia processing.
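For concreteness, below is a minimal sketch of this paradigm in Python (NumPy), using a generic echo-state-style reservoir. All names, sizes, and the toy delay task are illustrative assumptions, not details from the paper; the point is that only the final linear readout is fitted, and that fitting happens in software outside the physical substrate.

```python
# Minimal sketch of the standard reservoir computing pipeline, assuming a
# generic echo-state-style reservoir; names and shapes are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N, T = 100, 500                      # reservoir size, number of time steps

# Fixed (untrained) reservoir: random recurrent and input weights.
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius < 1
w_in = rng.normal(0, 1, N)

u = rng.normal(0, 1, T)              # input signal
y_target = np.roll(u, 3)             # toy task: recall the input 3 steps back

# Drive the reservoir; only its states are collected, never its weights.
x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    states[t] = x

# The only trained component: a linear readout fitted in software
# (ridge regression) -- exactly the external step the open question
# seeks to eliminate.
ridge = 1e-4
w_out = np.linalg.solve(states.T @ states + ridge * np.eye(N),
                        states.T @ y_target)
y_pred = states @ w_out
```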

The authors explicitly raise the open question of whether SOMNs can support more self-contained learning that does not rely on an externally trained readout layer. Realising such methods could bridge reservoir computing with associative or contrastive in-situ adaptation and reduce external computation.
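What a self-contained alternative would look like is, by construction, open. Purely as an assumption, one hedged possibility in the associative/contrastive direction mentioned above is a contrastive-Hebbian-style local rule applied directly to device conductances, sketched below; the function name, arguments, and conductance bounds are hypothetical and do not come from the paper.

```python
# Hypothetical sketch of a self-contained, in-situ update: a
# contrastive-Hebbian-style rule acting directly on device conductances,
# with no separately trained readout. Illustration only; the paper leaves
# the concrete mechanism open.
import numpy as np

def contrastive_update(G, x_free, x_clamped, eta=1e-3,
                       g_min=1e-6, g_max=1e-3):
    """Nudge conductances G toward the node-activity correlations seen in
    the clamped (target-driven) phase and away from those in the free phase."""
    dG = eta * (np.outer(x_clamped, x_clamped) - np.outer(x_free, x_free))
    return np.clip(G + dG, g_min, g_max)   # respect physical conductance range
```

Here the "free" and "clamped" phases play the role of the two relaxations in contrastive schemes such as equilibrium propagation; whether SOMN dynamics can physically realise comparable phases is precisely what the open question asks.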

References

Several open questions remain in physical reservoir computing with SOMNs, including the final energy consumption for real-world tasks, and whether methods can be developed for more self-contained learning that does not rely on training an external layer.

Self-Organising Memristive Networks as Physical Learning Systems (2509.00747 - Caravelli et al., 31 Aug 2025) in Section 5.1 (Physical reservoir computing)