Indoor Localization for an Autonomous Model Car: A Marker-Based Multi-Sensor Fusion Framework (2310.05198v1)

Published 8 Oct 2023 in cs.RO

Abstract: Global navigation satellite systems readily provide accurate position information when localizing a robot outdoors. However, no analogous standard solution yet exists for mobile robots operating indoors. This paper presents an integrated framework for indoor localization and experimental validation of an autonomous driving system based on an advanced driver-assistance system (ADAS) model car. The global pose of the model car is obtained by fusing information from fiducial markers, inertial sensors, and wheel odometry. To achieve robust localization, we investigate and compare two extensions to the Extended Kalman Filter: the first with adaptive noise tuning and the second with a Chi-squared test for measurement outlier detection. An efficient, low-cost ground-truth measurement method using a single LiDAR sensor is also proposed to validate the results. The performance of the localization algorithms is tested on a complete autonomous driving system with trajectory planning and model predictive control.
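The second EKF extension mentioned in the abstract gates marker measurements with a Chi-squared test on the filter innovation. A minimal sketch of such a gate is shown below; this is not the authors' implementation, and the function name and the 95% threshold for a 2-D position measurement are illustrative assumptions.

```python
import numpy as np

# 95% Chi-squared threshold for 2 degrees of freedom (e.g. a planar
# position measurement); value taken from standard Chi-squared tables.
CHI2_95_2DOF = 5.991

def chi_squared_gate(innovation, S, threshold=CHI2_95_2DOF):
    """Return True if the measurement passes the outlier gate.

    innovation : z - h(x), the measurement residual
    S          : innovation covariance H P H^T + R
    """
    # Squared Mahalanobis distance of the residual under covariance S.
    d2 = float(innovation @ np.linalg.solve(S, innovation))
    return d2 <= threshold

# Example: a residual consistent with the noise model passes,
# while a gross outlier (e.g. a misdetected marker) is rejected.
S = np.diag([0.04, 0.04])                 # 0.2 m std. dev. per axis
print(chi_squared_gate(np.array([0.1, -0.1]), S))   # True
print(chi_squared_gate(np.array([2.0, 2.0]), S))    # False
```

In an EKF update, a rejected measurement is simply skipped, so a single spurious marker detection cannot corrupt the pose estimate.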

Citations (2)