
A real-time dynamic obstacle tracking and mapping system for UAV navigation and collision avoidance with an RGB-D camera (2209.08258v4)

Published 17 Sep 2022 in cs.RO and cs.AI

Abstract: The real-time dynamic environment perception has become vital for autonomous robots in crowded spaces. Although the popular voxel-based mapping methods can efficiently represent 3D obstacles with arbitrarily complex shapes, they can hardly distinguish between static and dynamic obstacles, leading to the limited performance of obstacle avoidance. While plenty of sophisticated learning-based dynamic obstacle detection algorithms exist in autonomous driving, the quadcopter's limited computation resources cannot achieve real-time performance using those approaches. To address these issues, we propose a real-time dynamic obstacle tracking and mapping system for quadcopter obstacle avoidance using an RGB-D camera. The proposed system first utilizes a depth image with an occupancy voxel map to generate potential dynamic obstacle regions as proposals. With the obstacle region proposals, the Kalman filter and our continuity filter are applied to track each dynamic obstacle. Finally, the environment-aware trajectory prediction method is proposed based on the Markov chain using the states of tracked dynamic obstacles. We implemented the proposed system with our custom quadcopter and navigation planner. The simulation and physical experiments show that our methods can successfully track and represent obstacles in dynamic environments in real-time and safely avoid obstacles. Our software is available on GitHub as an open-source ROS package.


Summary

  • The paper presents a hybrid 3D mapping approach that fuses occupancy voxel maps with depth-image region proposals for precise obstacle detection.
  • The paper introduces a dual-filter method combining Kalman and continuity filters to reliably identify and track dynamic obstacles.
  • The paper demonstrates effective real-time trajectory prediction using a Markov chain method, achieving over 25Hz performance on UAV hardware.

Overview of Real-Time Dynamic Obstacle Tracking and Mapping for UAVs

This paper introduces a sophisticated system designed for real-time dynamic obstacle tracking and mapping to aid Unmanned Aerial Vehicles (UAVs) in navigating and avoiding collisions. The authors focus on utilizing RGB-D cameras to provide the quadcopters with enhanced perception capabilities, addressing constraints typically faced by UAVs, such as limited computational resources.

The proposed solution is a 3D hybrid map that combines an occupancy voxel map, for efficient representation of the static environment, with depth-image-based obstacle region proposals. This combination lets the system distinguish static from dynamic obstacles, a significant limitation of conventional voxel-based methods. On top of the map, obstacles are tracked in real time using a combination of Kalman and continuity filters, maintaining robust performance within the quadcopter's restricted onboard compute budget.
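As a concrete illustration of the tracking step, a constant-velocity Kalman filter over a tracked obstacle centroid might look like the sketch below. The state layout, noise values, and time step are assumptions chosen for illustration, not the paper's actual parameters:

```python
import numpy as np

# Illustrative constant-velocity Kalman filter for one tracked obstacle
# centroid. State is [x, y, z, vx, vy, vz]; all matrices and noise values
# here are hypothetical, not taken from the paper.
class ObstacleKF:
    def __init__(self, pos, dt=0.05, q=0.1, r=0.05):
        self.x = np.array([*pos, 0.0, 0.0, 0.0])          # state estimate
        self.P = np.eye(6)                                # covariance
        self.F = np.eye(6)                                # motion model
        self.F[:3, 3:] = dt * np.eye(3)                   # pos += vel * dt
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))]) # measure position only
        self.Q = q * np.eye(6)                            # process noise
        self.R = r * np.eye(3)                            # measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x

    def update(self, z):
        # z: 3D centroid measured from a depth-image region proposal
        y = np.asarray(z) - self.H @ self.x               # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)          # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x
```

Feeding the filter centroids from successive depth frames yields a velocity estimate for each obstacle; the paper's continuity filter then decides whether that track is genuinely dynamic.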

Key Contributions

  • Region Proposal Detector with Map Refinement: By employing a lightweight depth image-based detector, the system generates region proposals for potential obstacles that are immediately refined using static map data. This hybrid method ensures improved estimation of obstacle positioning and sizing.
  • Dynamic Obstacle Identification and Tracking: The system enables continuous tracking of dynamic obstacles by implementing both the Kalman filter for velocity estimation and a novel continuity filter to ensure accurate dynamic classification, reducing false positives from static objects.
  • Environment-Aware Trajectory Prediction: Utilizing a Markov chain-based method, the trajectory prediction module incorporates environment interaction, enhancing prediction accuracy concerning obstacle movement relative to static map data.
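To make the third contribution concrete, here is a toy sketch of Markov-chain trajectory prediction over discretized velocity states. The states, transition matrix, and time step are invented for illustration; the paper additionally makes the chain environment-aware, for instance by suppressing transitions that would carry an obstacle into statically occupied space:

```python
import numpy as np

# Toy Markov-chain prediction over four discretized planar velocity states.
# The states and transition matrix are made up for illustration only.
VEL_STATES = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
T = np.array([  # row i -> probability of switching to velocity state j
    [0.7, 0.1, 0.1, 0.1],
    [0.1, 0.7, 0.1, 0.1],
    [0.1, 0.1, 0.7, 0.1],
    [0.1, 0.1, 0.1, 0.7],
])

def predict_positions(pos, state_probs, steps, dt=0.2):
    """Propagate a belief over velocity states and return the expected
    obstacle position at each future step."""
    pos = np.asarray(pos, dtype=float)
    p = np.asarray(state_probs, dtype=float)
    out = []
    for _ in range(steps):
        p = p @ T                          # one Markov transition
        pos = pos + dt * (p @ VEL_STATES)  # expected displacement
        out.append(pos.copy())
    return out
```

An obstacle currently moving along +x, for example, is predicted to drift forward while the belief gradually spreads over the other velocity states.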

Simulation experiments showcase the system's efficiency in handling environments with static structures and multiple moving entities, demonstrating accurate real-time obstacle detection and trajectory prediction. Physical experiments further reinforce its capability to sustain performance when implemented on actual UAV systems in various test scenarios.
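The map-refinement idea behind the first contribution can also be sketched as a simple consistency check of a proposal box against the static voxel map. Everything here, the grid layout, the set-of-indices map, and the `dynamic_ratio` threshold, is an assumption for illustration rather than the paper's implementation:

```python
import numpy as np

# Hypothetical refinement of a depth-image region proposal against a static
# occupancy voxel map: voxels inside the proposal box that the map already
# marks as static occupied are discounted, so a mostly map-free proposal
# suggests a dynamic obstacle.
def classify_proposal(box_min, box_max, static_map, resolution=0.1,
                      dynamic_ratio=0.5):
    """box_min/box_max: 3D corners of the proposal box (metres).
    static_map: set of (i, j, k) voxel indices marked static occupied."""
    lo = np.floor(np.asarray(box_min) / resolution).astype(int)
    hi = np.ceil(np.asarray(box_max) / resolution).astype(int)
    total = 0
    static_hits = 0
    for i in range(lo[0], hi[0]):
        for j in range(lo[1], hi[1]):
            for k in range(lo[2], hi[2]):
                total += 1
                if (i, j, k) in static_map:
                    static_hits += 1
    free_ratio = 1.0 - static_hits / max(total, 1)
    return "dynamic" if free_ratio >= dynamic_ratio else "static"
```

A proposal fully explained by previously mapped static voxels is rejected, while one covering mostly unmapped space is passed on to the tracker as a dynamic obstacle candidate.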

Experimental Findings

Quantitative analysis shows the proposed system outperforming state-of-the-art methods in dynamic obstacle position and velocity estimation. In trials, the UAV exhibited a lower failure ratio in trajectory prediction, particularly in complex environments, and the full pipeline sustained an update rate above 25 Hz on an Nvidia Xavier NX, validating its suitability for real-time onboard deployment.

Implications and Future Directions

The practical implications are significant: such a system allows UAVs to operate effectively in the dense or unpredictable environments typical of applications like urban navigation and search and rescue. Because the pipeline runs on modest onboard compute, it can be deployed across a range of UAV platforms, improving scalability and adaptability.

The authors note that cameras with wider fields of view could further improve tracking accuracy in dynamic environments, extending the system's situational awareness and enabling more seamless operation in varied spaces.

In conclusion, this paper presents a coherent and methodical advance in dynamic obstacle mapping and tracking for UAVs, one that could serve as a foundation for greater autonomy in future aerial robotics applications. Its balance of computational efficiency and robust handling of dynamic environments marks a noteworthy step forward in UAV capabilities.
