Efficient Parameter Mining and Freezing for Continual Object Detection
Abstract: Continual object detection is essential for enabling intelligent agents to interact proactively with humans in real-world settings. While parameter-isolation strategies have been explored extensively for continual learning in classification, they have yet to be fully harnessed in incremental object detection. Drawing on prior work that mines individual neuron responses, together with recent advances in neural pruning, we propose efficient ways to identify the layers that matter most for preserving a detector's performance across sequential updates. Our findings highlight the substantial advantages of layer-level parameter isolation for incremental learning in object detection models, and point to promising avenues for future research and real-world application.
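The abstract describes scoring layers by importance and freezing the most important ones before each sequential update. The paper's exact criterion is not given here, so the sketch below uses one common pruning-style proxy, mean absolute gradient magnitude per layer; all layer names and gradient values are illustrative, not from the paper.

```python
# Hypothetical sketch of layer-level parameter isolation: score each layer
# by the mean absolute gradient of its parameters (a common pruning-style
# importance criterion), then freeze the top-k layers before learning the
# next task. Layer names and gradient statistics below are made up.

def layer_importance(grads_per_layer):
    """Mean absolute gradient per layer.

    grads_per_layer: dict mapping layer name -> list of parameter gradients.
    """
    return {
        name: sum(abs(g) for g in grads) / len(grads)
        for name, grads in grads_per_layer.items()
    }

def layers_to_freeze(importance, k):
    """Return the k most important layers; these stay fixed on the new task."""
    return sorted(importance, key=importance.get, reverse=True)[:k]

# Toy gradient statistics for three layers of a detector
grads = {
    "backbone.conv1": [0.9, -1.1, 0.8],
    "backbone.conv2": [0.1, -0.05, 0.02],
    "head.cls": [0.4, 0.3, -0.5],
}
scores = layer_importance(grads)
frozen = layers_to_freeze(scores, k=1)  # -> ["backbone.conv1"]
```

In a real detector the same idea would be applied by accumulating gradients over a calibration batch and setting `requires_grad = False` on the selected layers before fine-tuning on the new classes.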