Research Article

CMPF-UNet: a ConvNeXt multi-scale pyramid fusion U-shaped network for multi-category segmentation of remote sensing images

Article: 2311217 | Received 11 Oct 2023, Accepted 23 Jan 2024, Published online: 14 Feb 2024

Abstract

Most U-shaped convolutional neural network (CNN) methods suffer from insufficient feature extraction and fail to fully exploit global and multi-scale context information, making it difficult to distinguish similar objects and shadow-occluded objects in remote sensing images. This article proposes a ConvNeXt multi-scale pyramid fusion U-shaped network (CMPF-UNet). We first propose a novel backbone network based on ConvNeXt to enhance image feature extraction, and use ConvNeXt bottleneck blocks to reconstruct the decoder. Furthermore, a Scale-Aware Pyramid Fusion (SAPF) module and a Residual Atrous Spatial Pyramid Pooling (RASPP) module are proposed to dynamically fuse the rich multi-scale context information in high-level features. Finally, multiple Global Pyramid Guidance (GPG) modules are embedded in the network to provide the decoder with global context information at different levels by reconstructing the skip connections. Experiments on the Vaihingen and Potsdam datasets indicate that the proposed CMPF-UNet achieves more accurate segmentation results.
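To illustrate the general idea behind a residual atrous spatial pyramid pooling block such as the RASPP module described above, the following is a minimal PyTorch sketch. It is not the authors' implementation: the branch dilation rates, normalization choices, and fusion layer are assumptions for illustration only; it shows only the common pattern of parallel dilated convolutions whose fused output is added back to the input through a residual connection.

```python
import torch
import torch.nn as nn


class RASPPSketch(nn.Module):
    """Hypothetical residual ASPP block: parallel 3x3 convolutions with
    increasing dilation rates capture multi-scale context, a 1x1 convolution
    fuses the branches, and a residual connection adds the result to the input."""

    def __init__(self, channels: int, dilations=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Sequential(
                # padding = dilation keeps the spatial size unchanged for a 3x3 kernel
                nn.Conv2d(channels, channels, 3, padding=d, dilation=d, bias=False),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
            )
            for d in dilations
        )
        # 1x1 convolution fuses the concatenated branches back to `channels`
        self.fuse = nn.Conv2d(channels * len(dilations), channels, kernel_size=1)

    def forward(self, x):
        multi_scale = torch.cat([branch(x) for branch in self.branches], dim=1)
        return x + self.fuse(multi_scale)  # residual connection


# Shape check on a dummy high-level feature map
x = torch.randn(1, 64, 32, 32)
y = RASPPSketch(64)(x)
print(tuple(y.shape))  # (1, 64, 32, 32)
```

Because every branch preserves the spatial resolution, the residual addition is well defined, which is the property that lets such a block be dropped between encoder and decoder without altering the rest of the U-shaped architecture.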

Data availability statement

The data provided in this article can be obtained from this website (https://www.isprs.org/education/benchmarks/UrbanSemLab/2d-sem-label-vaihingen.aspx).

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This research was funded by the Jilin Province Science and Technology Development Plan (Grant No. YDZJ202301ZYTS285), the National Natural Science Foundation of China (No. 21606099), the Innovative and Entrepreneurial Talents Foundation of Jilin Province (No. 2023QN31) and the Natural Science Foundation of Jilin Province (No. YDZJ202301ZYTS157).