Automatic Data Processing for Space Robotics Machine Learning (2310.01932v1)
Abstract: Autonomous terrain classification is an important problem in planetary navigation, whether the goal is to identify scientific sites of interest or to traverse treacherous areas safely. Past Martian rovers have relied on human operators to manually identify a navigable path from transmitted imagery. Our goals on Mars over the next few decades will require rovers that can autonomously move farther, faster, and through more dangerous landscapes, which demands improved terrain classification for traversability. Autonomous navigation through extreme environments will enable the search for water on the Moon and Mars as well as preparations for human habitats. Advances in machine learning have demonstrated the potential to improve terrain classification for ground vehicles on Earth. However, classification results for space applications are limited by the availability of training data suitable for supervised learning methods. This paper contributes an open-source automatic data processing pipeline that uses camera geometry to co-locate Curiosity and Perseverance Mastcam image products with Mars overhead maps via ray projection over a terrain model. In future work, this automated pipeline will be leveraged to develop machine learning methods for terrain classification.
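The core geometric step described above, casting a viewing ray from the rover camera over a terrain model until it intersects the surface, can be illustrated with a minimal sketch. The code below is not the authors' pipeline: it assumes a simple pinhole camera, a digital elevation model (DEM) stored as a NumPy array, and illustrative names (`dem`, `dem_origin`, `dem_res`, `cam_pos`, `cam_rot`), and it finds the overhead-map coordinate seen by a given image pixel by naive ray marching.

```python
# Minimal sketch, not the published pipeline: co-locate a ground-camera pixel with an
# overhead map by projecting its viewing ray over a terrain model (DEM) and returning
# the map coordinates where the ray first passes below the surface.
# All variable and function names here are illustrative assumptions.
import numpy as np

def pixel_ray(cam_rot, pixel, focal_len_px, principal_pt):
    """Unit viewing ray in the map/world frame for an image pixel (pinhole model)."""
    u, v = pixel
    cx, cy = principal_pt
    ray_cam = np.array([u - cx, v - cy, focal_len_px], dtype=float)  # ray in camera frame
    ray_world = cam_rot @ ray_cam                                     # rotate into world frame
    return ray_world / np.linalg.norm(ray_world)

def project_to_terrain(cam_pos, ray_dir, dem, dem_origin, dem_res,
                       step=0.25, max_range=500.0):
    """March along the ray; return map (x, y) at the first point below the DEM surface."""
    t = 0.0
    while t < max_range:
        p = cam_pos + t * ray_dir                       # 3-D point along the ray
        col = int((p[0] - dem_origin[0]) / dem_res)     # DEM column index (easting)
        row = int((p[1] - dem_origin[1]) / dem_res)     # DEM row index (northing)
        if 0 <= row < dem.shape[0] and 0 <= col < dem.shape[1]:
            if p[2] <= dem[row, col]:                   # ray has dropped below the terrain
                return p[0], p[1]                       # map coordinates of the intersection
        t += step
    return None                                         # no intersection within max_range
```

A production pipeline would presumably replace the pinhole model and hand-set pose with the camera models and rover localization products published alongside the Mastcam data, but the ray-over-terrain intersection shown here is the basic operation the abstract refers to.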