Research Article

Remote sensing image feature matching via graph classification with local motion consistency

Article: 2308713 | Received 07 Sep 2023, Accepted 17 Jan 2024, Published online: 06 Feb 2024
 

ABSTRACT

Feature matching is a classic problem in computer vision. In this paper, we propose a graph classification method based on neighborhood motion consistency to eliminate erroneous matches. Specifically, we transform the coordinates of putative feature matches into motion vectors on a unified scale. For a given match, we construct a graph centered on that match and incorporating its neighboring matches, with node attributes designed to represent the similarity between the motion vector of the central node and those of its neighbors. We then develop a lightweight graph attention neural network for graph classification that predicts whether the match under consideration is correct. To train the model effectively, we employ a random cropping strategy to generate a large number of diverse graphs for classifier training. We evaluate our method on datasets encompassing translational remote sensing data, rotational and scaled remote sensing imagery produced via random cropping, and nonrigid fisheye data. Our algorithm outperforms current state-of-the-art methods.
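The construction described in the abstract (scale-normalized motion vectors, a graph centered on a candidate match, and node attributes encoding similarity to the center's vector) can be sketched as follows. This is a minimal illustration using NumPy, not the paper's implementation: the function name, the star-graph topology, the choice of k nearest neighbors in the source image, and the use of cosine similarity as the node attribute are all assumptions made for the example; the paper's graph attention classifier that consumes such graphs is not reproduced here.

```python
import numpy as np

def build_match_graph(src_pts, dst_pts, center_idx, k=8):
    """Illustrative sketch: build a star graph around one candidate match.

    src_pts, dst_pts: (N, 2) arrays of matched keypoint coordinates.
    Node attributes encode how similar each neighbor's motion vector
    is to the center match's motion vector (local motion consistency).
    """
    # Motion vectors, normalized to a unified scale by their mean magnitude.
    vecs = dst_pts - src_pts
    scale = np.linalg.norm(vecs, axis=1).mean() + 1e-8
    vecs = vecs / scale

    # k nearest neighbors of the center match in the source image
    # (index 0 of the sort is the center itself, so it is skipped).
    d = np.linalg.norm(src_pts - src_pts[center_idx], axis=1)
    nbrs = np.argsort(d)[1:k + 1]

    # Node attribute: cosine similarity between the center's motion
    # vector and each neighbor's motion vector.
    c = vecs[center_idx]
    sims = []
    for j in nbrs:
        v = vecs[j]
        denom = np.linalg.norm(c) * np.linalg.norm(v) + 1e-8
        sims.append(float(c @ v / denom))

    # Star-graph edges: the center node connected to each neighbor.
    edges = [(int(center_idx), int(j)) for j in nbrs]
    return np.array(sims), edges
```

Under this sketch, a correct match surrounded by consistent matches yields similarities near 1, while an outlier whose motion disagrees with its neighborhood yields low or negative similarities, which is the signal a downstream graph classifier can learn to threshold.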

Acknowledgments

We sincerely thank the authors of OANet, GMS, LPM, LMR, and mTopKRP for providing their algorithm codes, which facilitated the comparative experiments.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

The code, trained weights and experimental data that support the findings of this study are openly available in GitHub at https://github.com/codingbadbad/RS-Image-Feature-Matching-via-GCwith-LMC. The test dataset is publicly available in figshare at https://figshare.com/articles/dataset/RS_Image_Feature_Matching_via_GCwith_LMC/23995008.

Additional information

Funding

This work was supported by the Key Scientific and Technological Innovation Projects of Fujian Province [grant number 2022G02008] and the Education and Scientific Research Project of the Fujian Provincial Department of Finance [grant number KY030346].