PRGS: Patch-to-Region Graph Search for Visual Place Recognition

Research output: Contribution to journal › Article › peer-review

2 Scopus citations

Abstract

Visual Place Recognition (VPR) is the task of estimating a target location from visual information under changing conditions, and it usually follows a two-stage strategy of global retrieval followed by reranking. Existing reranking methods in VPR establish only a single correspondence between the query image and each candidate image, overlooking the neighbor correspondences among the retrieved candidates that could enhance reranking. In this paper, we propose a Patch-to-Region Graph Search (PRGS) method that exploits these neighbor correspondences. First, since searching for neighbor correspondences relies on important features, we design a Patch-to-Region (PR) module that aggregates patch-level features into region-level features to highlight important features. Second, to estimate each candidate's reranking score from the neighbor correspondences, we design a Graph Search (GS) module that establishes the neighbor correspondences among all candidate and query images in graph space. Moreover, PRGS integrates well with both CNN and transformer backbones. We achieve competitive performance on several benchmarks, offering a 64% improvement in matching time and an approximately 59% reduction in FLOPs compared to state-of-the-art methods. The code is released at https://github.com/LKELN/PRGS.
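The abstract does not specify how the PR module aggregates patch-level features into region-level features; a common approach is soft assignment of patches to regions followed by assignment-weighted pooling. The sketch below illustrates that general idea only, with a random stand-in for the learned patch-to-region projection (`proj`, `patch_to_region`, and all parameters here are hypothetical, not the paper's actual design):

```python
import numpy as np

def patch_to_region(patch_feats, n_regions, seed=0):
    """Illustrative sketch: aggregate patch-level features (N, D)
    into region-level features (R, D) via soft assignment.

    NOTE: `proj` stands in for a learned projection; in the actual
    PR module the aggregation is learned, not random.
    """
    rng = np.random.default_rng(seed)
    n_patches, dim = patch_feats.shape
    proj = rng.standard_normal((dim, n_regions))     # stand-in projection
    logits = patch_feats @ proj                      # (N, R) patch->region scores
    assign = np.exp(logits - logits.max(axis=1, keepdims=True))
    assign /= assign.sum(axis=1, keepdims=True)      # soft assignment per patch
    # Assignment-weighted mean of patches for each region -> (R, D).
    weights = assign / (assign.sum(axis=0, keepdims=True) + 1e-8)
    return weights.T @ patch_feats

patch_feats = np.random.default_rng(1).standard_normal((196, 64))
regions = patch_to_region(patch_feats, n_regions=8)
print(regions.shape)  # (8, 64)
```

Each of the 196 patch features contributes to every region in proportion to its soft-assignment weight, so salient patches dominate the region descriptors used for downstream matching.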

Original language: English
Article number: 111673
Journal: Pattern Recognition
Volume: 166
DOIs
State: Published - Oct 2025

Keywords

  • Graph Convolutional Network
  • Patch-to-Region
  • Visual Place Recognition

