Research Article

Reconstructed NDVI and EVI datasets in China (ReVIChina) generated by a spatial-interannual reconstruction method

Pages 4749-4768 | Received 21 Sep 2023, Accepted 08 Nov 2023, Published online: 15 Nov 2023

ABSTRACT

Remote sensing-based vegetation index (VI) data are significantly impacted by cloud contamination. Spatiotemporal reconstruction methods demonstrate higher accuracy than temporal reconstruction methods; however, the computing time and random access memory (RAM) consumption of these methods for large-scale reconstruction remain unclear. In this study, a method called spatial-interannual reconstruction (SIR) was proposed to reconstruct cloud-contaminated pixels in MODIS normalized difference VI (NDVI) and enhanced VI (EVI) data. SIR has four major advantages: (1) high accuracy: the average mean absolute error of SIR was 0.0338, which was 20.2% and 23.4% lower than that of two state-of-the-art spatiotemporal reconstruction methods, interpolation of the mean anomalies (IMA) and Gapfill, respectively; (2) high computing speed: the average computing time of SIR was 99.7% and 98.8% lower than that of IMA and Gapfill, respectively; (3) low RAM consumption; and (4) simultaneous reconstruction of all invalid values. Reconstructed 250 m spatial resolution, 16-day composite NDVI and EVI datasets for China from 2000 to 2022 (hereafter ReVIChina) were developed based on the SIR method and MODIS MOD13Q1 data. Spatiotemporal analyses revealed that the reconstructed datasets are more reliable than the original product and a similar existing dataset.
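To illustrate the general idea behind spatial-interannual gap filling, the sketch below combines a per-pixel interannual climatology with a coarse spatial anomaly term. It is a minimal, hypothetical illustration only; it is not the authors' SIR algorithm, and the array layout and function name are assumptions made for this example.

```python
import numpy as np

def interannual_gap_fill(vi, valid):
    """Illustrative spatial-interannual gap fill (not the paper's SIR method).

    vi    : float array, shape (years, periods, rows, cols)
    valid : bool array, same shape; True marks cloud-free observations.
    """
    masked = np.where(valid, vi, np.nan)

    # Interannual component: per-pixel, per-period median across all years,
    # i.e. a climatology built only from valid observations.
    clim = np.nanmedian(masked, axis=0)              # (periods, rows, cols)

    # Anomaly of each valid observation relative to that climatology.
    anomaly = masked - clim                          # broadcasts over years

    filled = vi.copy()
    for y in range(vi.shape[0]):
        for t in range(vi.shape[1]):
            gap = ~valid[y, t]
            if not gap.any():
                continue
            # Spatial component (crude stand-in for spatial interpolation):
            # shift the climatology by the scene-wide mean anomaly of the
            # valid pixels in the same composite.
            scene_anom = np.nanmean(anomaly[y, t])
            if np.isnan(scene_anom):                 # fully clouded scene
                scene_anom = 0.0
            filled[y, t][gap] = clim[t][gap] + scene_anom
    # Pixels that are invalid in every year remain NaN and would need a
    # purely spatial fallback.
    return filled
```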

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

MOD13Q1 NDVI and EVI data are accessible at: https://ladsweb.modaps.eosdis.nasa.gov/.
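For readers retrieving MOD13Q1 granules from the archive above, the sketch below shows one possible way to read and rescale the NDVI layer with pyhdf. The science dataset name "250m 16 days NDVI", the default fill value of -3000, and the scale-factor handling reflect common MOD13Q1 conventions and should be verified against the attributes of the downloaded file.

```python
import numpy as np
from pyhdf.SD import SD, SDC

def read_mod13q1_ndvi(hdf_path):
    """Read and rescale the NDVI layer of one MOD13Q1 HDF4 granule."""
    sd = SD(hdf_path, SDC.READ)
    sds = sd.select("250m 16 days NDVI")   # assumed standard SDS name
    attrs = sds.attributes()
    raw = sds.get().astype(np.float32)

    fill = attrs.get("_FillValue", -3000)
    scale = float(attrs.get("scale_factor", 10000.0))

    # MODIS land HDFs store the factor as 10000 (divide) in the file while
    # documentation quotes 0.0001 (multiply); handle both conventions.
    ndvi = raw / scale if scale > 1 else raw * scale
    ndvi[raw == fill] = np.nan              # mask fill values

    sds.endaccess()
    sd.end()
    return ndvi

# Example with a hypothetical file name:
# ndvi = read_mod13q1_ndvi("MOD13Q1.A2022001.h27v05.061.hdf")
```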

Additional information

Funding

This work was financially supported by the National Natural Science Foundation of China (Grant Nos. 41971295 and 42271328) and the Special Fund of Hubei Luojia Laboratory (Grant No. 220100031).