Energy-efficient recurrent networks with competitive performance on neuromorphic hardware
Determine whether recurrent neural network models such as echo-state networks, liquid-state machines, and state-space models can be implemented on neuromorphic systems whose energy costs scale favorably, either through scale-free connectivity (a constant fan-in per neuron) or through converging network activity (firing rates that decrease over time), while maintaining competitive task performance.
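To make the two energy characteristics concrete, here is a minimal sketch (not from the paper; all parameter names and values are illustrative) of a leaky spiking reservoir. Each neuron receives input from exactly `K` presynaptic neurons, so the recurrent synaptic work per step scales as `N*K` rather than `N^2` (the constant fan-in case), and the per-step spike count serves as a crude event-driven energy proxy that decreases over time when the weights are weak enough for activity to converge (the converging-activity case).

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, T = 200, 10, 50          # neurons, constant fan-in per neuron, time steps
leak, thresh = 0.9, 1.0        # membrane leak factor and spike threshold

# Constant fan-in: each neuron listens to exactly K presynaptic neurons,
# so per-step synaptic work scales as N*K rather than N^2.
pre = np.stack([rng.choice(N, size=K, replace=False) for _ in range(N)])
W = rng.normal(0.0, 0.1, size=(N, K))  # weak weights -> activity converges

v = rng.uniform(0.0, 2.0, N)   # membrane potentials, initially excited
events = []                    # spikes per step: event-driven energy proxy
for t in range(T):
    spikes = v >= thresh
    events.append(int(spikes.sum()))
    v = np.where(spikes, 0.0, leak * v)   # reset spikers, leak the rest
    v += (W * spikes[pre]).sum(axis=1)    # constant fan-in recurrence
```

With no external drive and sub-unity gain, `events` starts high and decays toward zero, so the cumulative event count (and hence an event-driven energy estimate) grows sublinearly in time; whether competitive task performance survives under such constraints is exactly the open question stated below.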
References
To date it remains an open question whether such recurrent networks can be realized with these energy characteristics while providing competitive performance.
— Neuromorphic Computing: A Theoretical Framework for Time, Space, and Energy Scaling
(arXiv:2507.17886, Aimone, 23 Jul 2025), Section 4.3, "A less-promising example: dense linear algebra-based ANNs" (final paragraph)