
A New Wave in Robotics: Survey on Recent mmWave Radar Applications in Robotics (2305.01135v4)

Published 2 May 2023 in cs.RO

Abstract: We survey the current state of millimeter-wave (mmWave) radar applications in robotics with a focus on unique capabilities, and discuss future opportunities based on the state of the art. Frequency Modulated Continuous Wave (FMCW) mmWave radars operating in the 76–81 GHz range are an appealing alternative to lidars, cameras, and other sensors operating in the near-visual spectrum. Radar is now available in new packaging classes that are more convenient for robotics, and its longer wavelengths can penetrate visual clutter such as fog, dust, and smoke. We begin by covering radar principles as they relate to robotics. We then review the relevant new research across a broad spectrum of robotics applications, beginning with motion estimation, localization, and mapping. We then cover object detection and classification, and close with an analysis of current datasets and calibration techniques that provide entry points into radar research.
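The FMCW principle the abstract refers to can be made concrete with two standard textbook relations (these are general FMCW formulas, not equations taken from the survey itself): range resolution is set by the chirp bandwidth, and a target's range maps to a beat-frequency tone. A minimal sketch:

```python
# Basic FMCW radar relations (standard textbook formulas, shown for
# illustration; parameter values are hypothetical, not from the paper).
C = 3e8  # speed of light, m/s

def range_resolution(bandwidth_hz: float) -> float:
    """Minimum separable range: c / (2 * B)."""
    return C / (2.0 * bandwidth_hz)

def beat_frequency(range_m: float, bandwidth_hz: float,
                   chirp_time_s: float) -> float:
    """Beat frequency f_b = 2 * R * S / c, with chirp slope S = B / T_c."""
    slope = bandwidth_hz / chirp_time_s
    return 2.0 * range_m * slope / C

# A 76-81 GHz automotive radar can sweep several GHz of bandwidth.
print(range_resolution(4e9))             # -> 0.0375 m (3.75 cm)
print(beat_frequency(10.0, 4e9, 50e-6))  # ~5.33 MHz tone for a target at 10 m
```

This is why the wide bandwidth available at 76–81 GHz matters: a 4 GHz sweep resolves targets only a few centimeters apart, which smaller-bandwidth radars cannot.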
