Research Letter

SGMFNet: a remote sensing image object detection network based on spatial global attention and multi-scale feature fusion

Pages 466-477 | Received 12 Aug 2023, Accepted 07 Mar 2024, Published online: 03 Apr 2024
 

ABSTRACT

When natural image detection methods are applied to remote sensing images, their detection performance is often unsatisfactory due to the random distribution of objects, complex backgrounds, and significant scale changes. To better detect objects against complex backgrounds and across large scale variations in remote sensing images, this study presents SGMFNet, a remote sensing image object detection network based on spatial global attention (SGA) and multi-scale feature fusion (MFF). The SGA module inserted into the backbone network better models contextual information, suppresses irrelevant background, and builds strong feature representations, making it easier for the subsequent MFF to extract scale-invariant information from adjacent feature layers. This study evaluates SGMFNet on the remote sensing datasets DIOR, NWPU VHR-10, and RSD-GOD. Quantitative and qualitative results on the three datasets demonstrate that SGMFNet outperforms other state-of-the-art methods in remote sensing object detection. SGMFNet can therefore assist in high-precision urban planning, military monitoring, and other tasks.
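The article does not reproduce its implementation here, but a minimal PyTorch sketch can illustrate the two ideas the abstract describes: a spatial global attention block that reweights backbone features using globally pooled context, and an adjacent-level fusion step standing in for MFF. The module names, layer choices, and hyperparameters below are illustrative assumptions, not the authors' actual SGA or MFF design.

```python
# Illustrative sketch (not the authors' code): a simplified spatial global
# attention block and an adjacent-level multi-scale fusion step in PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SpatialGlobalAttention(nn.Module):
    """Reweights features with a globally pooled channel context and a
    spatial attention map (hypothetical simplification of the paper's SGA)."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.context = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
        )
        self.spatial = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Channel-wise global context from average pooling over all positions.
        ctx = torch.sigmoid(self.context(F.adaptive_avg_pool2d(x, 1)))
        x = x * ctx
        # Spatial attention from per-position mean/max statistics,
        # intended to suppress irrelevant background responses.
        avg_map = x.mean(dim=1, keepdim=True)
        max_map = x.max(dim=1, keepdim=True).values
        attn = torch.sigmoid(self.spatial(torch.cat([avg_map, max_map], dim=1)))
        return x * attn


class AdjacentFeatureFusion(nn.Module):
    """Fuses a feature map with its coarser neighbour (FPN-style upsample
    and add), standing in for the paper's multi-scale feature fusion (MFF)."""

    def __init__(self, channels: int):
        super().__init__()
        self.smooth = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, fine: torch.Tensor, coarse: torch.Tensor) -> torch.Tensor:
        up = F.interpolate(coarse, size=fine.shape[-2:], mode="nearest")
        return self.smooth(fine + up)


if __name__ == "__main__":
    # Toy check with two adjacent backbone levels (e.g. stride 8 and 16).
    c3 = torch.randn(1, 256, 80, 80)
    c4 = torch.randn(1, 256, 40, 40)
    sga = SpatialGlobalAttention(256)
    mff = AdjacentFeatureFusion(256)
    fused = mff(sga(c3), sga(c4))
    print(fused.shape)  # torch.Size([1, 256, 80, 80])
```

Applying the attention before fusion mirrors the ordering in the abstract: background suppression in the backbone features is meant to make the subsequent cross-scale fusion of adjacent layers more effective.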

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

The dataset in our study is based on the DIOR dataset (https://opendatalab.com/DIOR), NWPU VHR-10 dataset (https://opendatalab.com/NWPU_VHR-10), and RSD-GOD dataset (https://github.com/ZhuangShuoH/geospatial-object-detection).

Additional information

Funding

This work was supported by the National Natural Science Foundation of China under Grant 62134004.
