Biomedical Engineering

Improvement of chest X-ray image segmentation accuracy based on FCA-Net

Article: 2229571 | Received 25 May 2023, Accepted 19 Jun 2023, Published online: 25 Jun 2023

Abstract

Medical image segmentation is a crucial stage in computer vision and image processing that helps make later-stage diagnosis more accurate, because segmentation of medical images such as X-rays can extract tissue, organs, and pathological structures. However, medical image processing, particularly the segmentation step, faces significant challenges in feature representation, because medical images differ from other images in contrast, blur, and noise. This study proposes lung segmentation of chest X-ray images based on deep learning with the FCA-Net (Fully Convolutional Attention Network) architecture. In addition, attention modules, namely spatial attention and channel attention, are added to the Res2Net encoder so that features can be represented better. The research was conducted on chest X-ray images from Qatar University available in the Kaggle repository. A total of 1,500 chest X-ray images of 256 × 256 pixels were divided into 10% testing data and 90% training data. The training data were then processed with K-fold cross-validation from K = 2 to K = 10. Experiments were conducted with scenarios using spatial attention, channel attention, and a combination of spatial and channel attention. The best result was obtained with the combination of spatial and channel attention at K = 5, with a DSC (Dice Similarity Coefficient) of 97.24% and an IoU (Intersection over Union) of 94.66% on the testing data. These results are better than those of the UNet++, DeepLabV3+, and SegNet architectures.
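The abstract does not specify implementation details, so the following is only a minimal sketch, assuming a PyTorch implementation and a CBAM-style design for the channel and spatial attention described above; the framework choice, module structure, reduction ratio, and kernel size are illustrative assumptions, not taken from the paper. It also includes the DSC and IoU metrics used for evaluation.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Channel attention: reweight encoder channels using pooled global context."""
    def __init__(self, channels: int, reduction: int = 16):  # reduction ratio assumed
        super().__init__()
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        self.max_pool = nn.AdaptiveMaxPool2d(1)
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1, bias=False),
        )
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        attn = self.sigmoid(self.mlp(self.avg_pool(x)) + self.mlp(self.max_pool(x)))
        return x * attn


class SpatialAttention(nn.Module):
    """Spatial attention: a per-pixel map built from channel-wise average and max."""
    def __init__(self, kernel_size: int = 7):  # kernel size assumed
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        avg_map = torch.mean(x, dim=1, keepdim=True)
        max_map, _ = torch.max(x, dim=1, keepdim=True)
        attn = self.sigmoid(self.conv(torch.cat([avg_map, max_map], dim=1)))
        return x * attn


def dice_coefficient(pred, target, eps=1e-6):
    """Dice Similarity Coefficient (DSC) for binary masks."""
    inter = (pred * target).sum()
    return (2 * inter + eps) / (pred.sum() + target.sum() + eps)


def iou(pred, target, eps=1e-6):
    """Intersection over Union (IoU) for binary masks."""
    inter = (pred * target).sum()
    union = pred.sum() + target.sum() - inter
    return (inter + eps) / (union + eps)


if __name__ == "__main__":
    feats = torch.randn(1, 256, 32, 32)  # hypothetical encoder feature map
    attended = SpatialAttention()(ChannelAttention(256)(feats))
    print(attended.shape)  # torch.Size([1, 256, 32, 32])
```

In this sketch the two modules are applied sequentially (channel then spatial), which corresponds to the "combination of spatial and channel attention" scenario; the single-attention scenarios would apply only one of the modules to the encoder features.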

Acknowledgments

This research project was supported by a grant from Research and Innovation for Advanced Indonesia, National Research and Innovation Agency, under contract numbers 52/IV/KS/06/2022 and 2988/UN46.4.1/PT/00.03/2022. It was also supported by datasets from Airlangga University Hospital (RSUA), Indonesia.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

The work was supported by the Research and Innovation for Advanced Indonesia, National Research and Innovation Agency [52/IV/KS/06/2022 and 2988/UN46.4.1/PT/00.03/2022].