Distance-Forward Learning: Enhancing the Forward-Forward Algorithm Towards High-Performance On-Chip Learning (2408.14925v1)
Abstract: The Forward-Forward (FF) algorithm was recently proposed as a local learning method to address the limitations of backpropagation (BP), offering biological plausibility along with memory-efficient, highly parallelized computation. However, it suffers from suboptimal performance and poor generalization, largely due to inadequate theoretical support and a lack of effective learning strategies. In this work, we reformulate FF using distance metric learning and propose a Distance-Forward (DF) algorithm to improve FF performance on supervised vision tasks while preserving its local computational properties, making it competitive for efficient on-chip learning. To achieve this, we reinterpret FF through the lens of centroid-based metric learning and develop a goodness-based N-pair margin loss to facilitate the learning of discriminative features. Furthermore, we integrate layer-collaboration local update strategies to reduce the information loss caused by greedy local parameter updates. Our method surpasses existing FF models and other advanced local learning approaches, with accuracies of 99.7\% on MNIST, 88.2\% on CIFAR-10, 59\% on CIFAR-100, 95.9\% on SVHN, and 82.5\% on ImageNette. Moreover, it achieves performance comparable to BP training at less than 40\% of the memory cost, while exhibiting stronger robustness to multiple types of hardware-related noise, demonstrating its potential for online learning and energy-efficient computation on neuromorphic chips.
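The abstract's two key ingredients can be sketched in a few lines: the FF "goodness" of a layer's activations, and a centroid-based N-pair margin loss that pulls each sample toward its class centroid while pushing it a margin away from every other centroid. This is a minimal illustration of the idea, not the paper's exact formulation; the function names, the hinge form of the loss, and the use of Euclidean distance are all assumptions.

```python
import math

def goodness(h):
    # FF "goodness" (Hinton, 2022): mean squared activation of a feature vector.
    return sum(x * x for x in h) / len(h)

def df_npair_margin_loss(feats, labels, centroids, margin=1.0):
    """Hypothetical sketch of a centroid-based N-pair margin loss.

    For each sample, the distance to its own class centroid should be
    smaller, by at least `margin`, than its distance to every other
    class centroid; violations are penalized with a hinge term.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    total, terms = 0.0, 0
    for h, y in zip(feats, labels):
        pos = dist(h, centroids[y])          # distance to own-class centroid
        for c, mu in enumerate(centroids):
            if c == y:
                continue
            # hinge: penalize when the negative centroid is not at least
            # `margin` farther away than the positive one
            total += max(0.0, margin + pos - dist(h, mu))
            terms += 1
    return total / terms
```

A sample sitting exactly on its own centroid and far from all others contributes zero loss; a sample equidistant from its own and another class's centroid contributes roughly the margin, which is what drives the layer toward discriminative features.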