
Synergistic Perception and Control Simplex for Verifiable Safe Vertical Landing (2312.02937v1)

Published 5 Dec 2023 in cs.RO, cs.SY, and eess.SY

Abstract: Perception, Planning, and Control form the essential components of autonomy in advanced air mobility. This work advances the holistic integration of these components to enhance the performance and robustness of the complete cyber-physical system. We adapt Perception Simplex, a system for verifiable collision avoidance amidst obstacle detection faults, to the vertical landing maneuver for autonomous air mobility vehicles. We improve upon this system by replacing static assumptions of control capabilities with dynamic confirmation, i.e., real-time confirmation of control limitations of the system, ensuring reliable fulfillment of safety maneuvers and overrides, without dependence on overly pessimistic assumptions. Parameters defining control system capabilities and limitations, e.g., maximum deceleration, are continuously tracked within the system and used to make safety-critical decisions. We apply these techniques to propose a verifiable collision avoidance solution for autonomous aerial mobility vehicles operating in cluttered and potentially unsafe environments.


Summary

  • The paper presents a verifiable collision avoidance system for the vertical landing of VTOL air taxis that adapts to real-time conditions.
  • It integrates verifiable obstacle detection with a high-assurance control strategy, replacing static assumptions about control capabilities with dynamic, real-time confirmation.
  • Software-in-the-loop simulations of urban landing scenarios validate the system, which avoids collisions while significantly reducing landing time compared to static-assumption baselines.

Introduction to the Safety-Critical Mechanism for Autonomous Aerial Vehicles

The transition towards autonomous aerial mobility, particularly VTOL (Vertical Takeoff and Landing) air taxis, presents unique challenges. Chief among them is the safe and efficient completion of vertical landing maneuvers in varied and potentially cluttered environments. The work argues that safe landing operations require a system that can both reliably detect obstacles and reliably execute the avoidance maneuvers those detections demand.

The Role of Synergistic Perception and Control Simplex

The Synergistic Perception and Control Simplex is the core safety mechanism for the autonomous air taxi. It integrates a verifiable obstacle detection algorithm and a high-assurance control strategy into the vehicle's control pipeline. The cornerstone of the approach is replacing static assumptions about control capabilities (such as maximum deceleration) with dynamic, real-time confirmation of what the vehicle can actually achieve. This enhances safety by adapting to real-time conditions while avoiding the loss of efficiency imposed by overly conservative assumptions.
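To make the dynamic-confirmation idea concrete, the following is a minimal sketch, not taken from the paper: all names (VehicleState, estimated_max_decel, clearance_below) are hypothetical. It shows how a Simplex-style monitor could hand control to the high-assurance controller when the confirmed deceleration capability no longer guarantees stopping short of an obstacle detected below the vehicle, using the kinematic stopping distance v^2 / (2a).

```python
# Illustrative sketch only; names and parameter values are assumptions,
# not the paper's implementation.
from dataclasses import dataclass

@dataclass
class VehicleState:
    descent_rate: float         # current downward speed [m/s]
    clearance_below: float      # verified distance to nearest obstacle below [m]
    estimated_max_decel: float  # dynamically confirmed deceleration capability [m/s^2]

def stopping_distance(speed: float, max_decel: float) -> float:
    """Kinematic distance needed to arrest the descent: v^2 / (2a)."""
    return speed ** 2 / (2.0 * max_decel)

def should_engage_safety_override(state: VehicleState,
                                  reaction_time: float = 0.2,
                                  margin: float = 1.0) -> bool:
    """Hand control to the high-assurance controller if the confirmed
    capability can no longer guarantee stopping short of the obstacle."""
    reaction_distance = state.descent_rate * reaction_time
    required = reaction_distance + stopping_distance(state.descent_rate,
                                                     state.estimated_max_decel)
    return state.clearance_below <= required + margin
```

For example, with a confirmed deceleration of 3 m/s², a 4 m/s descent, a 0.2 s reaction time, and a 1 m margin, the override engages once the verified clearance drops below roughly 4.5 m (0.8 m of reaction distance plus about 2.7 m of braking distance plus the margin); a more pessimistic static bound would trigger it much earlier and lengthen the landing.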

Contributions of the Study

The primary contribution of this work is a verifiable collision avoidance solution tailored to VTOL urban air mobility vehicles. The work highlights novel interactions between the components of the fail-safe mechanism that yield substantial reductions in landing time while upholding safety guarantees. It adapts and improves solutions originally developed for ground vehicles, extending their applicability to autonomous air taxis.

Evaluation and Findings

The system was evaluated in a software-in-the-loop simulation. The safety layer's components were integrated into an extensible setup that bridges a high-fidelity air taxi model with an urban environment simulator, and scenarios with varied obstacle configurations were tested to assess both safety and performance. The findings confirm that the proposed system avoids collisions and significantly reduces landing time compared to methods that rely on static assumptions.
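As a rough intuition for how such a check behaves inside a closed simulation loop, consider the toy stand-alone sketch below. It is not the paper's software-in-the-loop setup (which couples a high-fidelity air taxi model with an urban environment simulator); it simply steps a simplified constant-rate descent, starts from a pessimistic capability estimate, treats the commanded deceleration as confirmed once it is applied, and checks the stopping-distance condition each step.

```python
# Toy stand-alone descent loop; all dynamics, names, and numbers are illustrative.
def run_descent(initial_altitude: float = 30.0,   # [m]
                obstacle_height: float = 5.0,     # top of obstacle below the vehicle [m]
                descent_rate: float = 4.0,        # [m/s]
                commanded_decel: float = 3.5,     # [m/s^2]
                dt: float = 0.05) -> float:
    altitude, speed = initial_altitude, descent_rate
    confirmed_decel = 2.0   # pessimistic initial capability estimate [m/s^2]
    braking = False
    while altitude > obstacle_height and speed > 0.0:
        clearance = altitude - obstacle_height
        # Stopping-distance check with the currently confirmed capability.
        if clearance <= speed ** 2 / (2.0 * confirmed_decel) + 1.0:
            braking = True
        accel = -commanded_decel if braking else 0.0
        if braking:
            # In this toy model the commanded deceleration is achieved exactly,
            # so it becomes the confirmed capability for later checks.
            confirmed_decel = max(confirmed_decel, commanded_decel)
        speed = max(0.0, speed + accel * dt)
        altitude -= speed * dt
    return altitude - obstacle_height  # final clearance above the obstacle

print(f"Held {run_descent():.2f} m above the obstacle")
```

In this run the pessimistic initial estimate triggers braking at about 5 m of clearance even though roughly 3.3 m would suffice once the higher capability is confirmed, mirroring the paper's argument that static worst-case assumptions cost landing time that dynamic confirmation can recover.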

Looking Ahead

The paper points to future work on proactive adaptation to anticipated environmental changes and on tighter control integration within Perception and Control Simplex systems. The principles and methods developed here could also extend to other domains where safety-critical operation is paramount.

Overall, the synergistic perception and control solution presented in this paper is a step towards safe and efficient autonomous aerial mobility, particularly for VTOL air taxis, combining the rigor of verifiable safety measures with the performance required for practical, real-world operation.