Gradient-based Local Next-best-view Planning for Improved Perception of Targeted Plant Nodes (2311.16759v2)
Abstract: Robots are increasingly used in tomato greenhouses to automate labour-intensive tasks such as selective harvesting and de-leafing. To perform these tasks, robots must be able to accurately and efficiently perceive the plant nodes that need to be cut, despite high levels of occlusion from other plant parts. We formulate this problem as a local next-best-view (NBV) planning task, in which the robot must plan an efficient sequence of camera viewpoints to overcome occlusion and improve perception quality. Our formulation focuses on quickly improving the perception accuracy of a single target node to maximise its chances of being cut. Previous NBV planning methods mostly focused on global view planning and relied on random sampling of candidate viewpoints for exploration, which can incur high computational costs, select ineffective views due to poor candidates, and produce non-smooth trajectories due to inefficient sampling. We propose a gradient-based NBV planner that uses differential ray sampling to directly estimate the local gradient direction for viewpoint planning, overcoming occlusion and improving perception. In simulation experiments, we show that our planner handles occlusion and improves the 3D reconstruction and position estimation of nodes as well as a sampling-based NBV planner, while requiring ten times less computation and generating 28% more efficient trajectories.
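
To make the core idea concrete, below is a minimal sketch of gradient-based viewpoint optimisation via differentiable ray sampling, written in PyTorch. It is not the authors' implementation: the names `soft_occupancy` and `information_gain` are hypothetical, the occupancy field is a toy Gaussian occluder rather than a probabilistic octree, and the optimiser and objective are illustrative assumptions. The sketch only shows how a differentiable visibility measure lets the camera pose be updated by local gradient ascent instead of by sampling candidate viewpoints.

```python
# Sketch: gradient-based next-best-view (NBV) update via differentiable
# ray sampling. All names and the toy occupancy field are hypothetical
# illustrations, not the paper's released code.
import torch


def soft_occupancy(points: torch.Tensor) -> torch.Tensor:
    """Differentiable occupancy in [0, 1] at the queried 3D points.

    A real planner would softly interpolate a probabilistic map (e.g. an
    octree) so gradients can flow to the ray samples; here we use a toy
    Gaussian 'occluder' centred at the origin.
    """
    return torch.exp(-points.pow(2).sum(dim=-1))


def information_gain(camera_pos: torch.Tensor,
                     target: torch.Tensor,
                     n_samples: int = 32) -> torch.Tensor:
    """Visibility of the target along the camera-to-target ray.

    Points are sampled along the viewing ray and transmittance through
    the soft occupancy field is accumulated; higher means less occlusion.
    """
    t = torch.linspace(0.05, 0.95, n_samples).unsqueeze(-1)
    ray_points = camera_pos + t * (target - camera_pos)  # samples on the ray
    occ = soft_occupancy(ray_points)                     # per-sample occupancy
    transmittance = torch.prod(1.0 - occ)                # prob. ray is free
    return transmittance


# Gradient ascent on the camera position: step towards viewpoints that
# reduce occlusion of the target node, rather than scoring random samples.
target = torch.tensor([0.0, 0.0, 0.5])
camera = torch.tensor([1.0, 0.2, 0.0], requires_grad=True)
optimizer = torch.optim.SGD([camera], lr=0.1)

for _ in range(50):
    optimizer.zero_grad()
    loss = -information_gain(camera, target)  # maximise visibility
    loss.backward()
    optimizer.step()

print(camera.detach(), information_gain(camera, target).item())
```

The design point this illustrates is that once the visibility objective is differentiable in the camera pose, each NBV step is a single local gradient evaluation, which is what yields the reported savings in computation and the smoother trajectories compared to sampling-based candidate evaluation.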