Research Article

Deep multi-modal fusion network with gated unit for breast cancer survival prediction

Pages 883-896 | Received 21 Dec 2022, Accepted 02 May 2023, Published online: 11 May 2023
 

Abstract

Accurate survival prediction is a critical goal in the prognosis of breast cancer patients: it helps physicians make more patient-friendly decisions and guides appropriate treatment. Breast cancer often arises from genetic abnormalities, which prompts researchers to consider information such as gene expression and copy number variation alongside clinical data. Integrating these multi-modal data can improve a model's predictive power. However, because breast cancer patient data are highly imbalanced, fully extracting the characteristic information of the multi-modal data while accounting for its complementarity remains a challenge for survival prediction. To this end, we propose a deep multi-modal fusion network (DMMFN) that predicts the five-year survival of breast cancer patients by integrating clinical data, copy number variation data, and gene expression data. The imbalanced dataset is first processed with the oversampling method SMOTE-NC. Abstract modal features of the multi-modal data are then extracted by a two-layer one-dimensional convolutional neural network and a bi-directional long short-term memory network. Next, gated multimodal units dynamically adjust the weight coefficients of each modality to obtain fusion features. Finally, the fusion features are fed into a MaxoutMLP classifier to produce the final prediction. We conducted experiments on the METABRIC dataset to verify the value of the multi-modal data and compared DMMFN with other methods; a comprehensive performance evaluation shows that DMMFN achieves better predictive performance.
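The abstract's first preprocessing step, SMOTE-NC, oversamples the minority class in data that mix continuous and categorical features. The following is a minimal numpy sketch of that idea only (not the authors' code and not the imbalanced-learn `SMOTENC` implementation): continuous features are interpolated toward a random nearest neighbour, while categorical features take a majority vote among the neighbours. All variable names and the toy data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def smote_nc_sketch(X, cont_idx, cat_idx, n_new, k=3):
    """Simplified SMOTE-NC-style oversampling: interpolate continuous
    features toward a random neighbour; set categorical features by
    majority vote over the k nearest neighbours."""
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X))
        # neighbour distances computed on continuous features only (a simplification)
        d = np.linalg.norm(X[:, cont_idx] - X[i, cont_idx], axis=1)
        nbrs = np.argsort(d)[1:k + 1]          # k nearest, excluding the sample itself
        j = rng.choice(nbrs)
        new = X[i].copy()
        gap = rng.random()                     # random interpolation factor in [0, 1)
        new[cont_idx] = X[i, cont_idx] + gap * (X[j, cont_idx] - X[i, cont_idx])
        for c in cat_idx:                      # majority vote over neighbour categories
            vals, counts = np.unique(X[nbrs, c], return_counts=True)
            new[c] = vals[np.argmax(counts)]
        synthetic.append(new)
    return np.array(synthetic)

# toy minority class: two continuous columns, one categorical column (hypothetical)
X_min = np.array([[0.0, 1.0, 2.0],
                  [0.1, 1.1, 2.0],
                  [0.2, 0.9, 2.0],
                  [1.0, 0.0, 5.0]])
X_new = smote_nc_sketch(X_min, cont_idx=[0, 1], cat_idx=[2], n_new=5)
print(X_new.shape)  # (5, 3)
```

The real SMOTE-NC additionally folds categorical mismatches into the neighbour distance; this sketch omits that for brevity.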
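The fusion step described above can be illustrated with a small numpy sketch of a gated multimodal unit: each modality's feature vector gets a tanh projection, a gate network assigns a softmax weight per modality, and the fused feature is the gate-weighted sum. This is an assumption-laden illustration of the general GMU mechanism, not the paper's DMMFN architecture; the layer sizes and modality dimensions below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(v):
    e = np.exp(v - v.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class GatedMultimodalUnit:
    """Fuses M modality vectors: tanh-project each modality to a shared
    hidden size, compute a softmax gate over modalities from the
    concatenated inputs, and return the gate-weighted sum."""
    def __init__(self, in_dims, hidden):
        self.W_h = [rng.standard_normal((d, hidden)) * 0.1 for d in in_dims]
        self.W_z = rng.standard_normal((sum(in_dims), len(in_dims))) * 0.1

    def forward(self, xs):
        h = np.stack([np.tanh(x @ W) for x, W in zip(xs, self.W_h)])  # (M, hidden)
        z = softmax(np.concatenate(xs) @ self.W_z)                    # (M,) gate weights
        return (z[:, None] * h).sum(axis=0)                           # (hidden,)

# hypothetical feature sizes: clinical (25-d), CNV (200-d), gene expression (400-d)
gmu = GatedMultimodalUnit([25, 200, 400], hidden=64)
fused = gmu.forward([rng.standard_normal(25),
                     rng.standard_normal(200),
                     rng.standard_normal(400)])
print(fused.shape)  # (64,)
```

Because the gate weights are recomputed per sample, the unit can dynamically emphasise whichever modality is most informative for a given patient, which is the property the abstract attributes to the gated units.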

Disclosure statement

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Funding

This work was supported by the Scientific Research Plan Projects of Education Department of Jiangxi Province of China under the Grant No. GJJ160554, the Talent Plan Project of Fuzhou City of Jiangxi Province of China under the Grant No. 2021ED008, and the Opening Project of Jiangxi Key Laboratory of Cybersecurity Intelligent Perception under the Grant No. JKLCIP202202.
