Contour guided hierarchical model for shape matching

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

22 Scopus citations

Abstract

Owing to its simplicity and effectiveness, the star model is popular in shape matching. However, it suffers from loose geometric connections among parts. In this paper, we present a novel algorithm that reconsiders these connections and reduces the global matching to a set of interrelated local matchings. To this end, we divide the shape template into overlapping parts and model the matching through a part-based layered structure that uses a latent variable to constrain the parts' deformation. For inference, each part localizes its candidates via partial matching. Thanks to the contour fragments, the partial matching can be solved with a modified dynamic programming. The overlapping regions among the template's parts are then exploited to make the candidates of neighboring parts meet at their shared points, a process carried out by a refinement procedure based on iterative dynamic programming. Results on the ETHZ Shape and INRIA Horses datasets demonstrate the benefits of the proposed algorithm.
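To make the dynamic-programming step concrete, here is a minimal, generic sketch of aligning two contour fragments by DP. This is only an illustration of the general technique: the point representation, Euclidean cost, and skip penalties below are assumptions, not the paper's actual formulation of partial matching.

```python
def match_fragments(template, candidate):
    """Align two contour fragments (lists of (x, y) points) with dynamic
    programming, in the style of dynamic time warping.

    Returns the minimum cumulative point-to-point distance, allowing
    points on either fragment to be skipped at the cost of the match
    made on the other side.
    """
    n, m = len(template), len(candidate)
    INF = float("inf")
    # dp[i][j] = best cost of aligning template[:i] with candidate[:j]
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            tx, ty = template[i - 1]
            cx, cy = candidate[j - 1]
            cost = ((tx - cx) ** 2 + (ty - cy) ** 2) ** 0.5
            dp[i][j] = cost + min(dp[i - 1][j - 1],  # match both points
                                  dp[i - 1][j],      # repeat candidate point
                                  dp[i][j - 1])      # repeat template point
    return dp[n][m]
```

An identical pair of fragments aligns with zero cost, while a fragment translated by one unit accumulates one unit of cost per matched point. The paper's modified DP additionally handles the overlap constraints between neighboring parts, which this sketch omits.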

Original language: English
Title of host publication: 2015 International Conference on Computer Vision, ICCV 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1609-1617
Number of pages: 9
ISBN (Electronic): 9781467383912
DOIs
State: Published - 17 Feb 2015
Event: 15th IEEE International Conference on Computer Vision, ICCV 2015 - Santiago, Chile
Duration: 11 Dec 2015 - 18 Dec 2015

Publication series

Name: Proceedings of the IEEE International Conference on Computer Vision
Volume: 2015 International Conference on Computer Vision, ICCV 2015
ISSN (Print): 1550-5499

Conference

Conference: 15th IEEE International Conference on Computer Vision, ICCV 2015
Country/Territory: Chile
City: Santiago
Period: 11/12/15 - 18/12/15
