Pheno-Robot: An Auto-Digital Modelling System for In-Situ Phenotyping in the Field (2402.09685v1)
Abstract: Accurate reconstruction of plant models for phenotyping analysis is critical for optimising sustainable practices in precision agriculture. Traditional laboratory-based phenotyping, while valuable, falls short of capturing how plants grow under uncontrolled field conditions. Robotic technologies offer a promising avenue for large-scale, direct phenotyping in real-world environments. This study explores the deployment of emerging robotic and digital technologies in plant phenotyping to improve performance and efficiency. Three critical functional modules are introduced to automate the entire process: environmental understanding, robotic motion planning, and in-situ phenotyping. Experimental results demonstrate the effectiveness of the system in agricultural environments: the pheno-robot autonomously navigates around plants to collect high-quality data, and the in-situ modelling module reconstructs high-quality plant models from that data. The developed robotic system shows high efficiency and robustness, demonstrating its potential to advance plant science in real-world agricultural environments.