
MMCBE: Multi-modality Dataset for Crop Biomass Prediction and Beyond

Published 17 Apr 2024 in cs.CV | (2404.11256v3)

Abstract: Crop biomass, a critical indicator of plant growth, health, and productivity, is invaluable for crop breeding programs and agronomic research. However, accurate and scalable quantification of crop biomass remains out of reach due to limitations in existing measurement methods. One obstacle impeding the advancement of current crop biomass prediction methodologies is the scarcity of publicly available datasets. To address this gap, we introduce a new dataset in this domain: the Multi-modality dataset for crop biomass estimation (MMCBE). Comprising 216 sets of multi-view drone images, coupled with LiDAR point clouds and hand-labelled ground truth, MMCBE is the first multi-modality dataset in the field. It aims to establish benchmark methods for crop biomass quantification and to foster the development of vision-based approaches. We have rigorously evaluated state-of-the-art crop biomass estimation methods on MMCBE and explored additional applications, such as 3D crop reconstruction from drone imagery and novel-view rendering. With this publication, we make our comprehensive dataset available to the broader community.
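To make the dataset's structure concrete, the sketch below shows how one multi-modality sample (multi-view drone images, a LiDAR point cloud, and a hand-labelled biomass value) might be represented, together with a crude height-based biomass proxy of the kind commonly used as a LiDAR baseline. The field names, file names, and the `canopy_height` helper are illustrative assumptions, not MMCBE's actual schema or the paper's method.

```python
from dataclasses import dataclass

# Hypothetical container for one MMCBE sample; field names are
# assumptions for illustration, not the dataset's published schema.
@dataclass
class MMCBESample:
    drone_images: list   # paths to the multi-view drone images
    lidar_points: list   # LiDAR point cloud as (x, y, z) tuples, metres
    biomass_kg: float    # hand-labelled ground-truth biomass

def canopy_height(points, ground_percentile=0.05):
    """Crude LiDAR-derived canopy height: top of canopy minus ground level.

    Height-based proxies like this are a classic baseline for biomass
    estimation; a real pipeline would first classify ground returns.
    """
    zs = sorted(p[2] for p in points)
    ground = zs[int(len(zs) * ground_percentile)]
    return zs[-1] - ground

sample = MMCBESample(
    drone_images=["view_000.jpg", "view_001.jpg"],  # placeholder paths
    lidar_points=[(0.0, 0.0, 0.0), (0.1, 0.2, 0.02), (0.5, 0.4, 0.85)],
    biomass_kg=1.2,
)
print(round(canopy_height(sample.lidar_points), 2))  # 0.85
```

A vision-based estimator would regress biomass directly from the images or fused point cloud rather than relying on a single height feature.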

