Transformer based Endmember Fusion with Spatial Context for Hyperspectral Unmixing (2402.03835v3)

Published 6 Feb 2024 in eess.IV

Abstract: In recent years, transformer-based deep learning networks have gained popularity in Hyperspectral (HS) unmixing applications due to their superior performance. The attention mechanism within transformers enables input-dependent weighting and enhances contextual awareness during training. Drawing inspiration from this, we propose a novel attention-based Hyperspectral Unmixing algorithm, Transformer-based Endmember Fusion with Spatial Context for Hyperspectral Unmixing (FusionNet). The network leverages an ensemble of endmembers for initial guidance, avoiding the suboptimal results that many algorithms encounter because of their dependence on a single initialization. FusionNet also incorporates a Pixel Contextualizer (PC), which introduces contextual awareness into abundance prediction by attending to neighborhood pixels. Unlike Convolutional Neural Networks (CNNs) and traditional Transformer-based approaches, which are constrained to specific kernel or window shapes, FusionNet can operate on any arbitrary configuration of the neighborhood. We conducted a comparative analysis between FusionNet and eight state-of-the-art algorithms on three widely recognized real datasets and one synthetic dataset. The results demonstrate that FusionNet offers competitive performance compared to the other algorithms.
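The Pixel Contextualizer idea in the abstract, where a center pixel attends over an arbitrarily shaped set of neighboring pixels before abundances are predicted, can be illustrated with a short sketch. The PyTorch code below is a minimal, hypothetical rendering of that idea, not the paper's actual architecture: all names (PixelContextualizerSketch, embed_dim, num_heads) and the single learned endmember matrix are illustrative assumptions, whereas FusionNet is described as fusing an ensemble of endmember initializations.

```python
# Hypothetical sketch of a pixel contextualizer for hyperspectral unmixing:
# the center pixel's spectrum attends over an arbitrary neighborhood of
# pixel spectra, then abundances are predicted and the pixel is
# reconstructed under a linear mixing model. Illustration only.
import torch
import torch.nn as nn

class PixelContextualizerSketch(nn.Module):
    def __init__(self, num_bands: int, num_endmembers: int,
                 embed_dim: int = 64, num_heads: int = 4):
        super().__init__()
        self.embed = nn.Linear(num_bands, embed_dim)   # spectra -> tokens
        self.attn = nn.MultiheadAttention(embed_dim, num_heads,
                                          batch_first=True)
        self.to_abundance = nn.Linear(embed_dim, num_endmembers)
        # Single learned endmember matrix (P x bands); the paper instead
        # guides this with an ensemble of endmember extractions.
        self.endmembers = nn.Parameter(torch.rand(num_endmembers, num_bands))

    def forward(self, center: torch.Tensor, neighbors: torch.Tensor):
        # center: (B, bands); neighbors: (B, K, bands), where K pixels can
        # come from any neighborhood shape, not just a square window.
        q = self.embed(center).unsqueeze(1)   # (B, 1, embed_dim) query
        kv = self.embed(neighbors)            # (B, K, embed_dim) keys/values
        ctx, _ = self.attn(q, kv, kv)         # contextualized center pixel
        # Softmax enforces the usual non-negativity and sum-to-one
        # constraints on abundances.
        abundances = torch.softmax(self.to_abundance(ctx.squeeze(1)), dim=-1)
        reconstruction = abundances @ self.endmembers  # linear mixing model
        return abundances, reconstruction
```

Because the neighborhood is passed as an explicit (B, K, bands) tensor, K can be drawn from any spatial configuration, which is the flexibility the abstract contrasts with the fixed kernel and window shapes of CNNs and conventional windowed transformers.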

Citations (1)
