CARMA: Context-Aware Runtime Reconfiguration for Energy-Efficient Sensor Fusion (2306.15748v1)

Published 27 Jun 2023 in cs.CV

Abstract: Autonomous systems (AS) are systems that can adapt and change their behavior in response to unanticipated events and include systems such as aerial drones, autonomous vehicles, and ground/aquatic robots. AS require a wide array of sensors, deep-learning models, and powerful hardware platforms to perceive and safely operate in real-time. However, in many contexts, some sensing modalities negatively impact perception while increasing the system's overall energy consumption. Since AS are often energy-constrained edge devices, energy-efficient sensor fusion methods have been proposed. However, existing methods either fail to adapt to changing scenario conditions or fail to optimize energy efficiency system-wide. We propose CARMA: a context-aware sensor fusion approach that uses context to dynamically reconfigure the computation flow on a Field-Programmable Gate Array (FPGA) at runtime. By clock-gating unused sensors and model sub-components, CARMA significantly reduces the energy used by a multi-sensory object detector without compromising performance. We use a Deep-learning Processor Unit (DPU) based reconfiguration approach to minimize the latency of model reconfiguration. We evaluate multiple context-identification strategies, propose a novel system-wide energy-performance joint optimization, and evaluate scenario-specific perception performance. Across challenging real-world sensing contexts, CARMA outperforms state-of-the-art methods with up to 1.3x speedup and 73% lower energy consumption.
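To illustrate the control loop the abstract describes (identify the sensing context, then enable only the sensor branches worth running and gate the rest), the following is a minimal Python sketch. It is not the paper's implementation: the context labels, branch names, energy numbers, and helper functions (ContextIdentifier-style mapping, run_branch, fuse) are all hypothetical stand-ins, and the "clock-gating" is simulated by simply skipping disabled branches rather than reconfiguring an FPGA/DPU.

```python
from dataclasses import dataclass
from typing import Dict, List

# Hypothetical per-branch energy costs (joules per frame); illustrative only.
BRANCH_ENERGY = {"camera": 0.8, "lidar": 2.4, "radar": 0.5}

# Hypothetical mapping from an identified context to the sensor branches
# worth enabling. In CARMA this decision drives runtime reconfiguration of
# the FPGA/DPU; here disabled branches are simply skipped ("gated").
CONTEXT_TO_BRANCHES: Dict[str, List[str]] = {
    "clear_day": ["camera"],
    "night": ["camera", "lidar"],
    "fog_or_rain": ["radar", "lidar"],
}

@dataclass
class Frame:
    context: str                    # output of a context-identification model
    sensor_data: Dict[str, object]  # raw data keyed by sensor name

def run_branch(name: str, data: object) -> list:
    """Stand-in for one sensor-specific detector branch."""
    return [f"{name}_detection"]

def fuse(detections_per_branch: Dict[str, list]) -> list:
    """Placeholder late fusion step (e.g. box-level fusion in a real system)."""
    fused: list = []
    for dets in detections_per_branch.values():
        fused.extend(dets)
    return fused

def process(frame: Frame) -> tuple:
    """Run only the branches selected for the current context and tally energy."""
    active = CONTEXT_TO_BRANCHES.get(frame.context, list(BRANCH_ENERGY))
    detections = {b: run_branch(b, frame.sensor_data.get(b)) for b in active}
    energy = sum(BRANCH_ENERGY[b] for b in active)  # gated branches cost ~0 here
    return fuse(detections), energy

if __name__ == "__main__":
    frame = Frame(context="fog_or_rain",
                  sensor_data={"camera": None, "lidar": None, "radar": None})
    dets, joules = process(frame)
    print(dets, f"~{joules:.1f} J/frame (hypothetical numbers)")
```

In this toy setting, the energy saving comes entirely from not executing the camera branch in the "fog_or_rain" context; the paper's system-wide optimization additionally weighs per-context perception accuracy against energy when choosing which branches and model sub-components to keep active.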
