
dc.contributor.author: Feng, Huihang
dc.contributor.author: Liu, Lupeng
dc.contributor.author: Xiao, Jun
dc.contributor.editor: Chaine, Raphaëlle
dc.contributor.editor: Deng, Zhigang
dc.contributor.editor: Kim, Min H.
dc.date.accessioned: 2023-10-09T07:42:56Z
dc.date.available: 2023-10-09T07:42:56Z
dc.date.issued: 2023
dc.identifier.isbn: 978-3-03868-234-9
dc.identifier.uri: https://doi.org/10.2312/pg.20231285
dc.identifier.uri: https://diglib.eg.org:443/handle/10.2312/pg20231285
dc.description.abstract: This paper presents a progressive graph matching network, abbreviated as PGMNet. The method is more explainable and matches features from easy to hard. PGMNet contains two major blocks: a Sinkformers module and a guided-attention module. First, we use Sinkformers to obtain a similarity matrix, which can be seen as an assignment matrix between two sets of feature keypoints. Matches with the highest scores in both their rows and columns are selected as pre-matched correspondences. These pre-matched correspondences can then be leveraged to guide the update and matching of ambiguous features. The matching quality progressively improves as the transformer blocks go deeper, as visualized in Figure 1. Experiments show that our method achieves better results than typical attention-based methods.
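The pre-matching rule described in the abstract (keep a pair only when its score is the maximum of both its row and its column of the assignment matrix) can be sketched as follows. This is an illustrative sketch, not the authors' implementation; the function name and the toy matrix are assumptions.

```python
import numpy as np

def mutual_max_matches(scores: np.ndarray) -> list[tuple[int, int]]:
    """Select pre-matched correspondences: pairs (i, j) whose score is
    the maximum of both row i and column j of the assignment matrix."""
    row_best = scores.argmax(axis=1)  # best column for each row
    col_best = scores.argmax(axis=0)  # best row for each column
    return [(i, int(j)) for i, j in enumerate(row_best) if col_best[j] == i]

# Toy 3x3 assignment matrix between two keypoint sets
S = np.array([[0.9, 0.05, 0.05],
              [0.1, 0.2,  0.7 ],
              [0.3, 0.6,  0.1 ]])
print(mutual_max_matches(S))  # [(0, 0), (1, 2), (2, 1)]
```

Ambiguous keypoints, whose best score is not mutually maximal, are left unmatched here and would be resolved by the guided-attention stages in deeper blocks.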
dc.publisher: The Eurographics Association
dc.rights: Attribution 4.0 International License
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: CCS Concepts: Computing methodologies -> Matching; Mixed / augmented reality
dc.subject: Computing methodologies
dc.subject: Matching
dc.subject: Mixed / augmented reality
dc.title: Progressive Graph Matching Network for Correspondences
dc.description.seriesinformation: Pacific Graphics Short Papers and Posters
dc.description.sectionheaders: Posters
dc.identifier.doi: 10.2312/pg.20231285
dc.identifier.pages: 119-120
dc.identifier.pages: 2 pages

