Towards Over-Canopy Autonomous Navigation: Crop-Agnostic LiDAR-Based Crop-Row Detection in Arable Fields (2403.17774v3)
Abstract: Autonomous navigation is crucial for many robotics applications in agriculture. However, many existing methods depend on RTK-GPS devices, which are susceptible to radio-signal loss or intermittent reception of internet-delivered corrections. Consequently, research has increasingly focused on using RGB cameras for crop-row detection, though challenges persist once plants are fully grown. This paper introduces a LiDAR-based navigation system that achieves crop-agnostic, over-canopy autonomous navigation in row-crop fields, even when the canopy fully occludes the inter-row spacing. Our algorithm detects crop rows across diverse scenarios, encompassing various crop types, growth stages, the presence of weeds, curved rows, and discontinuities. Without relying on a global localization method (e.g., GPS), our navigation system can navigate these challenging scenarios autonomously, detect the end of each crop row, and transition to the next row on its own, providing a crop-agnostic approach to navigating an entire field. The proposed system has been tested in various simulated and real agricultural fields, achieving an average cross-track error of 3.55 cm without human intervention. It has been deployed on a customized UGV robot that can be reconfigured depending on field conditions.
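The abstract reports navigation accuracy as an average cross-track error of 3.55 cm. As a hedged illustration (not the authors' implementation), cross-track error can be computed as the signed perpendicular distance from the robot to a line fitted through detected row points; the line fit via PCA and the example coordinates below are assumptions for demonstration only:

```python
import numpy as np

def fit_row_line(points):
    """Fit a 2D line (centroid + unit direction) to row points via PCA."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The principal component of the point cloud approximates the row direction.
    _, _, vt = np.linalg.svd(pts - centroid)
    direction = vt[0] / np.linalg.norm(vt[0])
    return centroid, direction

def cross_track_error(robot_xy, centroid, direction):
    """Signed perpendicular distance from the robot to the fitted row line."""
    d = np.asarray(robot_xy, dtype=float) - centroid
    # 2D cross product of the row direction with the offset vector
    # yields the signed lateral deviation.
    return float(direction[0] * d[1] - direction[1] * d[0])

# Hypothetical example: row points along y = 0, robot 0.05 m off the line.
row_pts = [(x, 0.0) for x in np.linspace(0.0, 5.0, 20)]
c, u = fit_row_line(row_pts)
err = cross_track_error((2.5, 0.05), c, u)
print(abs(err))  # 0.05
```

Averaging this quantity over a traversal gives the kind of cross-track metric the paper reports.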