Research Article

A Deep Learning Model with Axial Attention for Radar Echo Extrapolation

Article: 2311003 | Received 16 Jan 2023, Accepted 23 Jan 2024, Published online: 03 May 2024

ABSTRACT

Radar echo extrapolation is an important approach to precipitation nowcasting that uses historical radar echo images to predict future echo images. In this paper, we introduce the self-attention mechanism into the Trajectory Gated Recurrent Unit (TrajGRU) model. Under the sequence-to-sequence framework, we develop a novel convolutional recurrent neural network, the Self-attention Trajectory Gated Recurrent Unit (SA-TrajGRU), which incorporates the self-attention mechanism. The SA-TrajGRU model, which combines the spatiotemporally variant structure of TrajGRU with a self-attention module, is simple and effective. We evaluate our approach on the Moving MNIST-2 dataset and the CIKM AnalytiCup 2017 radar echo dataset. The experimental results show that the performance of the proposed SA-TrajGRU model is comparable to that of other convolutional recurrent neural network models. Under the radar echo threshold of 25 dBZ, the HSS and CSI scores of SA-TrajGRU are higher than those of the other models, indicating that SA-TrajGRU gives the most accurate predictions at this threshold.
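The abstract does not give implementation details of the self-attention module. As a rough illustration of the general idea only, the sketch below applies scaled dot-product self-attention over the flattened spatial positions of a feature map in NumPy; all shapes, the random projection matrices, and the function name are illustrative assumptions, not the paper's actual SA-TrajGRU architecture.

```python
import numpy as np

def self_attention(x, wq, wk, wv):
    """Scaled dot-product self-attention over spatial positions.

    x          : (N, C) array of N flattened spatial positions, C channels
    wq, wk, wv : (C, D) query/key/value projection matrices
    returns    : (N, D) attended features
    """
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])       # (N, N) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)      # row-wise softmax
    return attn @ v                               # weighted sum of values

# Toy example: an 8x8 feature map with 16 channels.
rng = np.random.default_rng(0)
H, W, C, D = 8, 8, 16, 16
x = rng.standard_normal((H * W, C))
wq, wk, wv = (rng.standard_normal((C, D)) for _ in range(3))
out = self_attention(x, wq, wk, wv)
print(out.shape)  # (64, 16)
```

Such a block lets each spatial position aggregate information from every other position, complementing the locally connected recurrent updates of a ConvGRU-style cell.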

Acknowledgements

We thank Prof. Xiangfeng Guan for English proofreading and valuable suggestions.

Data availability statement

The data that support the findings of this study are available from the Tianchi Laboratory website on Alibaba Cloud at https://tianchi.aliyun.com/dataset/1085 with the permission of Alibaba.

Disclosure statement

No potential conflict of interest was reported by the authors.

Supplementary material

Supplemental data for this article can be accessed online at https://doi.org/10.1080/08839514.2024.2311003.

Additional information

Funding

This work was supported by the Natural Science Foundation of Fujian Province of China (Nos. 2021J011225 and 2023J011099).