Low-Rank Prompt-Guided Transformer for Hyperspectral Image Denoising

Xiaodong Tan, Mingwen Shao, Yuanjian Qiao, Tiyao Liu, Xiangyong Cao

Research output: Contribution to journal › Article › peer-review

15 Scopus citations

Abstract

Hyperspectral image (HSI) denoising is an essential preprocessing step for downstream applications. Although vision transformer (ViT)-based approaches show impressive denoising performance through self-similarity modeling, these methods still fail to fully exploit spatial and spectral correlations while ensuring flexibility and efficacy. To address this issue, we propose a hyperspectral denoising transformer using a low-rank prompt (HyLoRa), which simultaneously takes the spatial self-similarity and the spectral low-rank property into account for HSI denoising. Specifically, to fully utilize the intrinsic similarity in the spatial domain, we perform cross-shaped window-based spatial self-attention to effectively model local and global similarity. Moreover, to exploit the low-rank inductive bias, we integrate a low-rank prompt module into the attention calculation, which learns corrected low-dimensional vectors from a large collection of HSIs. This helps to better refine the underlying noise-free structure representations. Compared with existing works, HyLoRa builds powerful capabilities for modeling spatial and spectral correlations to correct the low-rank representation in feature space. Extensive experiments on both simulated and real remote sensing noise demonstrate that our HyLoRa consistently surpasses state-of-the-art methods.
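The abstract describes augmenting self-attention with learned low-rank prompt vectors. The following is a minimal NumPy sketch of that idea, under the assumption (not stated in the abstract) that the prompt is a rank-r set of vectors, factored as `coeff @ basis`, appended to the key/value set of a single attention head; the function and variable names are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def low_rank_prompt_attention(x, w_q, w_k, w_v, coeff, basis):
    """Single-head self-attention whose key/value set is augmented
    with low-rank prompt vectors (a hypothetical simplification of
    the paper's low-rank prompt module)."""
    d = x.shape[-1]
    q = x @ w_q                                    # queries: (n, d)
    prompt = coeff @ basis                         # prompts: (m, d), rank <= r
    k = np.concatenate([x @ w_k, prompt], axis=0)  # keys:    (n + m, d)
    v = np.concatenate([x @ w_v, prompt], axis=0)  # values:  (n + m, d)
    attn = softmax(q @ k.T / np.sqrt(d))           # tokens attend to prompts too
    return attn @ v                                # output:  (n, d)

# Toy dimensions: n spatial tokens of depth d, m prompts of rank r.
n, d, m, r = 8, 16, 3, 4
x = rng.standard_normal((n, d))
w_q, w_k, w_v = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
coeff = rng.standard_normal((m, r))   # per-prompt coefficients
basis = rng.standard_normal((r, d))   # shared low-rank basis
out = low_rank_prompt_attention(x, w_q, w_k, w_v, coeff, basis)
```

Because the prompts are factored through a rank-r basis, they inject at most r independent directions into the value mixture, mirroring the spectral low-rank bias the paper exploits; in the actual model these factors would be learned end to end rather than sampled randomly.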

Original language: English
Article number: 5520815
Journal: IEEE Transactions on Geoscience and Remote Sensing
Volume: 62
DOIs
State: Published - 2024

Keywords

  • Hyperspectral image (HSI) denoising
  • low-rank representation
  • prompt learning
  • transformer
