Online Learning Extreme Learning Machine with Low-Complexity Predictive Plasticity Rule and FPGA Implementation (2512.21777v1)
Abstract: We propose a simplified, biologically inspired predictive local learning rule that eliminates the need for global backpropagation in conventional neural networks and for membrane integration in event-based training. Weight updates are triggered only on prediction errors and are performed as sparse, binary-driven vector additions. We integrate this rule into an extreme learning machine (ELM), replacing the conventional, computationally intensive matrix inversion. Compared to standard ELM, our approach reduces the training complexity from O(M^3) to O(M), where M is the number of hidden-layer nodes, while maintaining comparable accuracy (at most 3.6% and 2.0% degradation on the training and test datasets, respectively). We demonstrate an FPGA implementation and compare it with existing studies, showing significant reductions in computational and memory requirements. This design demonstrates strong potential for energy-efficient online learning on low-cost edge devices.
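To make the abstract's claim concrete, below is a minimal, hedged sketch of the general idea: an ELM whose random hidden layer is fixed and whose output weights are trained online by an error-triggered update that only adds or subtracts a binarized hidden-activation vector, instead of solving the usual pseudoinverse. This is an illustration under stated assumptions, not the authors' exact rule; the class name `PredictiveELM`, the learning rate `lr`, and the sign-based binarization are assumptions made for the example.

```python
import numpy as np

class PredictiveELM:
    """Illustrative ELM with an error-triggered, binary-driven output-weight update
    (a sketch of the kind of rule the abstract describes, not the paper's exact method)."""

    def __init__(self, n_inputs, n_hidden, n_outputs, lr=1.0, seed=0):
        rng = np.random.default_rng(seed)
        # Random, fixed input weights and biases: the standard ELM hidden layer.
        self.W_in = rng.standard_normal((n_hidden, n_inputs))
        self.b = rng.standard_normal(n_hidden)
        # Trainable output weights, updated only when a prediction error occurs.
        self.W_out = np.zeros((n_outputs, n_hidden))
        self.lr = lr  # assumed hyperparameter for this sketch

    def hidden(self, x):
        # Binarized (0/1) hidden activations, so each weight update reduces to
        # adding or subtracting a sparse binary vector.
        return (self.W_in @ x + self.b > 0).astype(np.float64)

    def predict(self, x):
        return int(np.argmax(self.W_out @ self.hidden(x)))

    def train_step(self, x, label):
        h = self.hidden(x)
        pred = int(np.argmax(self.W_out @ h))
        if pred != label:                       # update triggered only on a prediction error
            self.W_out[label] += self.lr * h    # strengthen the correct class
            self.W_out[pred]  -= self.lr * h    # weaken the wrongly predicted class
        return pred
```

In this sketch the per-sample update costs O(M) vector additions over the M hidden nodes, in contrast to the O(M^3) cost of the batch pseudoinverse used in standard ELM training, which is the complexity reduction the abstract refers to.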