Research Article

TR2RM: an urban road network generation model based on multisource big data

Article: 2344596 | Received 04 Dec 2023, Accepted 14 Apr 2024, Published online: 24 Apr 2024
 

ABSTRACT

Road networks are an important part of the transportation infrastructure through which people experience a city. Existing methods of vector map data generation mainly depend on a single data source, e.g. images, trajectories, or existing raster maps, and are therefore limited by information fragmentation caused by incomplete data. This study proposes an urban road network extraction framework named trajectory and remote-sensing image to RoadMap (TR2RM), which is based on deep learning technology and combines high-resolution remote sensing images with big trajectory data. The framework is composed of three components. The first component generates a feature map by fusing remote sensing images with trajectories. The second component is a novel neural network architecture, denoted AD-LinkNet, which identifies roads from the fused dataset produced by the first component. The last component is a postprocessing step that generates an accurate vector map. Taking Rome, Beijing, and Wuhan as examples, we conducted extensive experiments to verify the effectiveness of TR2RM. The results showed that the topological and geometric correctness of the road networks generated by TR2RM were 83.86% and 88.27% in Rome, 74.72% and 80.36% in Beijing, and 73.83% and 77.7% in Wuhan, respectively.
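To make the three-stage pipeline summarized above more concrete, the following is a minimal, hedged sketch (not the authors' code) of how such a workflow could be wired together. It assumes trajectories are already georeferenced to the image grid as integer (row, col) point arrays, that some pretrained binary road-segmentation model stands in for AD-LinkNet, and that scikit-image is available for skeletonization; names such as `model.predict` are hypothetical placeholders.

```python
# Illustrative sketch of a trajectory + image road-extraction pipeline.
# Stage 1: fuse a rasterized trajectory heatmap with the RGB image.
# Stage 2: run a road-segmentation CNN (AD-LinkNet in the paper; any model here).
# Stage 3: thin the predicted mask toward a vectorizable centreline.
import numpy as np
from skimage.morphology import skeletonize


def rasterize_trajectories(points_rc: np.ndarray, shape: tuple) -> np.ndarray:
    """Accumulate trajectory points into a heatmap aligned with the image grid."""
    heat = np.zeros(shape, dtype=np.float32)
    rows, cols = points_rc[:, 0], points_rc[:, 1]
    np.add.at(heat, (rows, cols), 1.0)
    return heat / max(heat.max(), 1.0)  # normalize to [0, 1]


def fuse(image_rgb: np.ndarray, heat: np.ndarray) -> np.ndarray:
    """Stage 1: stack the trajectory heatmap as a fourth channel of the image."""
    return np.dstack([image_rgb.astype(np.float32) / 255.0, heat])


def extract_road_mask(fused: np.ndarray, model) -> np.ndarray:
    """Stage 2: predict per-pixel road probability with a placeholder model."""
    prob = model.predict(fused[None, ...])[0, ..., 0]  # hypothetical predict() API
    return prob > 0.5


def vectorize(mask: np.ndarray) -> np.ndarray:
    """Stage 3 (simplified): thin the mask to one-pixel centrelines.

    A full postprocessing step, as described in the abstract, would also build
    a topological graph from the skeleton and simplify it into polylines.
    """
    return skeletonize(mask)
```

This is only a structural outline under the stated assumptions; the paper's actual fusion, network architecture, and vectorization are described in the article body.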

Acknowledgments

The authors sincerely thank the anonymous reviewers for their constructive comments and valuable suggestions, which helped improve the quality of this article, as well as all the organizations and scholars who provided data and technical support for this work.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

The data and code that support the findings of this study are openly available at https://figshare.com/s/1779f7508e2c5974f792.

Additional information

Funding

This work was jointly supported by the National Natural Science Foundation of China [grant number: 42271449].