Detecting and Classifying Bio-Inspired Artificial Landmarks Using In-Air 3D Sonar (2308.05504v3)

Published 10 Aug 2023 in cs.RO and eess.SP

Abstract: Various autonomous applications rely on recognizing specific known landmarks in their environment. For example, Simultaneous Localization And Mapping (SLAM) is an important technique that lays the foundation for many common tasks, such as navigation and long-term object tracking. It entails building a map on the go from sensory inputs that are prone to accumulating errors. Recognizing landmarks in the environment plays a vital role in correcting these errors and further improving the accuracy of SLAM. The most popular sensors for SLAM today are optical, such as cameras and LiDAR, and these often rely on artificial landmarks such as QR codes. However, optical sensors become unreliable in certain conditions, e.g., foggy, dusty, reflective, or glass-rich environments. Sonar has proven to be a viable alternative for handling such situations, but acoustic sensors also require a different type of landmark. In this paper, we put forward a method to detect the presence of bio-mimetic acoustic landmarks using support vector machines trained on the frequency bands of the reflected acoustic echoes, captured with an embedded real-time imaging sonar.
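
The detection scheme named in the abstract, a support vector machine over per-frequency-band echo features, can be sketched in a few lines. The code below is a minimal illustration only, not the authors' implementation: the sampling rate, the ultrasonic band edges, the number of bands, the log compression, and the RBF kernel are all assumptions made here to produce a runnable example.

```python
import numpy as np
from scipy.signal import welch
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 450_000   # assumed sonar sampling rate (Hz); not stated in the abstract
N_BANDS = 16   # assumed number of frequency bands used as features

def band_energy_features(echo, fs=FS, n_bands=N_BANDS,
                         f_lo=20_000, f_hi=120_000):
    """Summarize one echo waveform as log energies in n_bands bands.

    The 20-120 kHz range is a placeholder for a typical in-air
    ultrasonic band, not the paper's actual values.
    """
    freqs, psd = welch(echo, fs=fs, nperseg=min(1024, len(echo)))
    edges = np.linspace(f_lo, f_hi, n_bands + 1)
    feats = np.empty(n_bands)
    for i in range(n_bands):
        mask = (freqs >= edges[i]) & (freqs < edges[i + 1])
        feats[i] = psd[mask].sum() if mask.any() else 0.0
    return np.log1p(feats)  # log-compress to tame the dynamic range

def train_landmark_detector(echoes, labels):
    """Fit an SVM that flags landmark echoes.

    echoes: iterable of 1-D echo waveforms
    labels: 1 = landmark reflector present, 0 = background clutter
    """
    X = np.array([band_energy_features(e) for e in echoes])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(X, labels)
    return clf
```

At inference time, `clf.predict(band_energy_features(echo)[None, :])` would flag whether a landmark is present in a new echo. Note that the paper runs its classification on an embedded real-time imaging sonar, whereas this sketch assumes offline waveforms and a Welch spectral estimate.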
