
Active propulsion noise shaping for multi-rotor aircraft localization

Published 27 Feb 2024 in cs.RO and cs.AI (arXiv:2402.17289v2)

Abstract: Multi-rotor aerial autonomous vehicles (MAVs) primarily rely on vision for navigation. However, visual localization and odometry techniques suffer from poor performance in low or direct sunlight, a limited field of view, and vulnerability to occlusions. Acoustic sensing can serve as a complementary or even alternative modality to vision in many situations, with the added benefits of lower system cost and energy footprint, which is especially important for micro aircraft. This paper proposes actively controlling and shaping the propulsion noise generated by the aircraft's rotors to benefit localization tasks, rather than treating it as a harmful nuisance. We present a neural network architecture for self-noise-based localization in a known environment, and show that training it jointly with a learned time-varying rotor phase modulation achieves accurate and robust localization. The proposed methods are evaluated using a computationally affordable simulation of MAV rotor noise in 2D acoustic environments, fitted to real recordings of rotor pressure fields.
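The core idea of the abstract, jointly optimizing a learnable time-varying rotor phase modulation and a localization network on simulated self-noise, can be sketched as follows. This is a minimal illustrative toy, not the authors' implementation: the free-field 2D acoustic model, array geometry, network sizes, and all variable names are my own assumptions.

```python
# Toy sketch: jointly learn a time-varying rotor phase schedule and a
# self-noise-based localization network (hypothetical forward model).
import torch

torch.manual_seed(0)
N_ROTORS, N_MICS, N_SAMPLES = 4, 8, 64
mic_pos = torch.rand(N_MICS, 2)                 # fixed 2D microphone array (assumed)
rotor_offsets = torch.rand(N_ROTORS, 2) * 0.1   # rotor positions relative to MAV center

# Learnable modulation: one phase per rotor per time step.
phase_schedule = torch.nn.Parameter(torch.zeros(N_SAMPLES, N_ROTORS))

def simulate_noise(pos):
    """Toy free-field model: each rotor emits a phase-modulated tone;
    mics receive a distance-attenuated, distance-shifted superposition."""
    t = torch.arange(N_SAMPLES, dtype=torch.float32) / N_SAMPLES
    src = pos.unsqueeze(1) + rotor_offsets                      # (B, R, 2)
    d = torch.cdist(src, mic_pos.expand(src.shape[0], -1, -1))  # (B, R, M)
    phase = (2 * torch.pi * 8.0 * t.view(1, -1, 1, 1)           # carrier tone
             + phase_schedule.view(1, N_SAMPLES, N_ROTORS, 1)   # learned modulation
             + 2 * torch.pi * 5.0 * d.unsqueeze(1))             # propagation phase
    return (torch.sin(phase) / (d.unsqueeze(1) + 0.1)).sum(dim=2)  # (B, T, M)

# Localization network: received self-noise -> 2D position estimate.
localizer = torch.nn.Sequential(
    torch.nn.Flatten(),
    torch.nn.Linear(N_SAMPLES * N_MICS, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 2),
)

# Joint optimization of modulation and localizer, as in the paper's setup.
opt = torch.optim.Adam(list(localizer.parameters()) + [phase_schedule], lr=1e-2)
for step in range(400):
    pos = torch.rand(32, 2)                 # random ground-truth 2D positions
    pred = localizer(simulate_noise(pos))   # localize from self-noise alone
    loss = torch.nn.functional.mse_loss(pred, pos)
    opt.zero_grad()
    loss.backward()
    opt.step()
print(f"final localization MSE: {loss.item():.4f}")
```

Because the phase schedule sits inside the differentiable forward model, gradients from the localization loss shape the emitted noise itself, which is the paper's central departure from treating rotor noise as a fixed nuisance.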

