Quick unsupervised hyperspectral dimensionality reduction for earth observation: a comparison (2402.16566v1)

Published 26 Feb 2024 in eess.IV

Abstract: Dimensionality reduction can be applied to hyperspectral images so that the most useful data can be extracted and processed more quickly. This is critical in any situation in which data volume exceeds the capacity of the computational resources, particularly in the case of remote sensing platforms (e.g., drones, satellites), but also in the case of multi-year datasets. Moreover, the computational strategies of unsupervised dimensionality reduction often provide the basis for more complicated supervised techniques. Seven unsupervised dimensionality reduction algorithms are tested on hyperspectral data from the HYPSO-1 earth observation satellite. Each particular algorithm is chosen to be representative of a broader collection. The experiments probe the computational complexity, reconstruction accuracy, signal clarity, sensitivity to artifacts, and effects on target detection and classification of the different algorithms. No algorithm consistently outperformed the others across all tests, but some general trends regarding the characteristics of the algorithms did emerge. With half a million pixels, computational time requirements of the methods varied by 5 orders of magnitude, and the reconstruction error varied by about 3 orders of magnitude. A relationship between mutual information and artifact susceptibility was suggested by the tests. The relative performance of the algorithms differed significantly between the target detection and classification tests. Overall, these experiments both show the power of dimensionality reduction and give guidance regarding how to evaluate a technique prior to incorporating it into a processing pipeline.
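The core experiment described above, reducing a hyperspectral cube to a few components and measuring reconstruction error, can be illustrated with a minimal PCA sketch. This is not the paper's code: the data here is a synthetic low-rank stand-in for a hyperspectral scene (random "endmember" spectra mixed by random abundances, plus noise), and the pixel/band counts are arbitrary illustrative choices, not HYPSO-1 parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_bands, k = 10_000, 120, 10  # illustrative sizes, not HYPSO-1's

# Synthetic stand-in for a hyperspectral cube (pixels x spectral bands):
# a few dominant spectral signatures mixed linearly, plus sensor-like noise.
endmembers = rng.random((k, n_bands))
abundances = rng.random((n_pixels, k))
cube = abundances @ endmembers + 0.01 * rng.standard_normal((n_pixels, n_bands))

# PCA via SVD of the mean-centered data matrix.
mean = cube.mean(axis=0)
centered = cube - mean
_, _, vt = np.linalg.svd(centered, full_matrices=False)
components = vt[:k]                # top-k principal directions

reduced = centered @ components.T  # (n_pixels, k) low-dimensional scores
reconstructed = reduced @ components + mean

# Mean-squared reconstruction error: one of the metrics compared in the paper.
mse = float(np.mean((cube - reconstructed) ** 2))
print(f"reconstruction MSE: {mse:.2e}")
```

Because the synthetic cube is rank-`k` apart from the added noise, the top-`k` reconstruction recovers it almost exactly; on real hyperspectral data the error depends on how many components the scene's variance actually occupies, which is exactly what the paper's comparison probes.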

