Context-Based Multiscale Unified Network for Missing Data Reconstruction in Remote Sensing Images

  • Mingwen Shao
  • Chao Wang
  • Tianjun Wu
  • Deyu Meng
  • Jiancheng Luo

Research output: Contribution to journal › Article › peer-review

16 Scopus citations

Abstract

Missing data reconstruction is a classical yet challenging problem in remote sensing images. Most current methods based on traditional convolutional neural networks require supplementary data and can handle only one specific task. To address these limitations, we propose a novel generative adversarial network (GAN)-based missing data reconstruction method in this letter, which is capable of various reconstruction tasks given only single-source data as input. Two auxiliary patch-based discriminators are deployed to impose additional constraints on the local and global regions, respectively. To better fit the nature of remote sensing images, we introduce special convolutions and an attention mechanism in a two-stage generator, thereby improving the tradeoff between accuracy and efficiency. Combined with perceptual and multiscale adversarial losses, the proposed model produces coherent structures with better details. Qualitative and quantitative experiments demonstrate that the proposed model performs on par with multisource methods in generating visually plausible reconstruction results. Moreover, further exploration shows a promising way for the proposed model to utilize spatio-spectral-temporal information. The code and models are available at https://github.com/Oliiveralien/Inpainting-on-RSI.
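To make the loss design named in the abstract concrete, here is a minimal, hypothetical PyTorch sketch (not the authors' released code; see the linked repository for that) of a multiscale adversarial loss with a global and a local patch-based discriminator. It assumes a hinge GAN formulation and a binary hole mask; the discriminator architecture, the mask-based stand-in for cropping the local region, and all layer sizes are illustrative assumptions.

```python
# Hypothetical sketch of global + local patch-based adversarial losses.
# Not the paper's implementation; architecture and mask handling are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F

def patch_discriminator(in_ch=3, base=64):
    """A small PatchGAN: outputs a grid of per-patch real/fake logits."""
    return nn.Sequential(
        nn.Conv2d(in_ch, base, 4, 2, 1), nn.LeakyReLU(0.2, inplace=True),
        nn.Conv2d(base, base * 2, 4, 2, 1), nn.LeakyReLU(0.2, inplace=True),
        nn.Conv2d(base * 2, base * 4, 4, 2, 1), nn.LeakyReLU(0.2, inplace=True),
        nn.Conv2d(base * 4, 1, 4, 1, 1),  # per-patch logits
    )

class MultiscaleAdversarialLoss(nn.Module):
    """Hinge adversarial loss from a global discriminator on the full image
    and a local discriminator on the masked (missing) region."""
    def __init__(self):
        super().__init__()
        self.d_global = patch_discriminator()
        self.d_local = patch_discriminator()

    def g_loss(self, fake, mask):
        # Generator tries to fool both discriminators.
        local_fake = fake * mask  # mask-multiply as a stand-in for a hole crop
        return -(self.d_global(fake).mean() + self.d_local(local_fake).mean())

    def d_loss(self, real, fake, mask):
        # Hinge loss: push real logits above +1, fake logits below -1.
        loss = 0.0
        for d, r, f in ((self.d_global, real, fake),
                        (self.d_local, real * mask, fake * mask)):
            loss += F.relu(1.0 - d(r)).mean() + F.relu(1.0 + d(f.detach())).mean()
        return loss
```

In training, d_loss would be minimized with respect to the two discriminators and g_loss with respect to the generator, alongside the perceptual term (e.g., an L1 distance between deep features of the output and the target), which is omitted here for brevity.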

Original language: English
Journal: IEEE Geoscience and Remote Sensing Letters
Volume: 19
DOIs
State: Published - 2022

Keywords

  • Context aware
  • generative adversarial network (GAN)
  • image reconstruction
  • multiscale
  • remote sensing images

