ReGO: Reference-Guided Outpainting for Scenery Image

Research output: Contribution to journal › Article › peer-review


Abstract

We present ReGO (Reference-Guided Outpainting), a new method for the task of sketch-guided image outpainting. Despite the significant progress made in producing semantically coherent content, existing outpainting methods often fail to deliver visually appealing results due to blurry textures and generative artifacts. To address these issues, ReGO leverages neighboring reference images to synthesize texture-rich results by transferring pixels from them. Specifically, an Adaptive Content Selection (ACS) module is incorporated into ReGO to facilitate pixel transfer for texture compensation of the target image. Additionally, a style ranking loss is introduced to maintain stylistic consistency while preventing the generated part from being influenced by the style of the reference images. ReGO is a model-agnostic learning paradigm for outpainting tasks. In our experiments, we integrate ReGO with three state-of-the-art outpainting models to evaluate its effectiveness. The results obtained on three scenery benchmarks, i.e., NS6K, NS8K and SUN Attribute, demonstrate the superior performance of ReGO compared to prior art in terms of texture richness and authenticity. Our code is available at https://github.com/wangyxxjtu/ReGO-Pytorch.
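To illustrate the ranking idea behind a style ranking loss, the following is a minimal sketch (not the paper's implementation): it assumes a Gram-matrix style representation over feature maps and a margin hinge that pushes the generated region's style closer to the target image than to the reference image whose pixels were borrowed. The function names, margin value, and feature shapes are all illustrative assumptions.

```python
import numpy as np

def gram_matrix(feat):
    # feat: (C, H, W) feature map; the Gram matrix summarizes
    # channel-wise correlations, a common proxy for image style.
    c, h, w = feat.shape
    f = feat.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def style_ranking_loss(gen_feat, target_feat, ref_feat, margin=0.1):
    # Hypothetical margin-based ranking: the generated content's style
    # should be nearer to the target image than to the reference image,
    # by at least `margin`; otherwise a hinge penalty is incurred.
    g_gen = gram_matrix(gen_feat)
    g_tgt = gram_matrix(target_feat)
    g_ref = gram_matrix(ref_feat)
    d_target = np.mean((g_gen - g_tgt) ** 2)  # style distance to target
    d_ref = np.mean((g_gen - g_ref) ** 2)     # style distance to reference
    return max(0.0, d_target - d_ref + margin)
```

When the generated features match the target's style exactly, the loss reduces to `max(0, margin - d_ref)`, so a reference with a sufficiently distinct style incurs no penalty; borrowed pixels that drag the output toward the reference style raise the loss.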

Original language: English
Pages (from-to): 1375-1388
Number of pages: 14
Journal: IEEE Transactions on Image Processing
Volume: 33
DOIs
State: Published - 2024

Keywords

  • GAN
  • Image outpainting
  • adversarial learning
  • generation model

