FRACTAL: An Ultra-Large-Scale Aerial Lidar Dataset for 3D Semantic Segmentation of Diverse Landscapes (2405.04634v4)

Published 7 May 2024 in cs.CV and cs.LG

Abstract: Mapping agencies are increasingly adopting Aerial Lidar Scanning (ALS) as a new tool to map buildings and other above-ground structures. Processing ALS data at scale requires efficient point classification methods that perform well over highly diverse territories. Large annotated Lidar datasets are needed to evaluate these classification methods; however, current Lidar benchmarks have restricted scope and often cover a single urban area. To bridge this data gap, we introduce the FRench ALS Clouds from TArgeted Landscapes (FRACTAL) dataset: an ultra-large-scale aerial Lidar dataset made of 100,000 dense point clouds with high-quality labels for 7 semantic classes and spanning 250 km$^2$. FRACTAL achieves high spatial and semantic diversity by explicitly sampling rare classes and challenging landscapes from five different regions of France. We describe the data collection, annotation, and curation process of the dataset. We provide baseline semantic segmentation results using a state-of-the-art 3D point cloud classification model. FRACTAL aims to support the development of 3D deep learning approaches for large-scale land monitoring.
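
The abstract describes 100,000 dense, labeled point clouds covering 7 semantic classes. As a quick orientation, the sketch below shows how one such tile could be inspected. It assumes the clouds are delivered as LAS/LAZ files with per-point labels in the standard classification field; the file name and the class-code mapping are illustrative placeholders, not taken from the paper.

```python
# Minimal sketch: inspect one ALS tile's per-class point distribution.
# Assumptions (not from the paper): LAS/LAZ delivery, labels stored in the
# standard "classification" field, and the hypothetical class codes below.
import laspy
import numpy as np

# Hypothetical 7-class mapping; the actual FRACTAL codes may differ.
CLASS_NAMES = {
    1: "unclassified",
    2: "ground",
    3: "low vegetation",
    5: "high vegetation",
    6: "building",
    9: "water",
    17: "bridge",
}

def summarize_tile(path: str) -> None:
    """Print the point count and per-class label distribution of one tile."""
    las = laspy.read(path)                   # reads .las (.laz needs a lazrs/laszip backend)
    labels = np.asarray(las.classification)  # per-point semantic label codes
    codes, counts = np.unique(labels, return_counts=True)
    print(f"{path}: {len(labels):,} points")
    for code, count in zip(codes, counts):
        name = CLASS_NAMES.get(int(code), f"class {code}")
        print(f"  {name:<16} {count:>10,} ({100.0 * count / len(labels):.1f}%)")

if __name__ == "__main__":
    summarize_tile("fractal_tile_example.laz")  # placeholder file name
```

Such a per-class census is also a quick sanity check of the dataset's stated goal of boosting rare classes through targeted sampling.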

