
Modeling the spatio-temporal dynamics of land use change with recurrent neural networks (1803.10915v2)

Published 29 Mar 2018 in stat.AP

Abstract: This study applies recurrent neural networks (RNNs), which are known for their ability to process sequential information, to model the spatio-temporal dynamics of land use change (LUC) and to forecast annual land use maps of the city of Tsukuba, Japan. We develop two categories of RNN models: (1) the simple RNN, which is the basic RNN variant, and (2) three RNN variants with advanced gated architectures: long short-term memory (LSTM), LSTM with peephole connections (LSTM-peephole), and gated recurrent unit (GRU) models. The four models are developed using spatio-temporal data with high temporal resolution: annual data for 2000 to 2010, 2011, and 2012 to 2016 are used for training, validation, and testing, respectively. The predictive performances are evaluated using classification metrics (accuracy and F1 score) and map-comparison metrics (Kappa simulation and fuzzy Kappa simulation). The results show that all RNN models achieve F1 scores higher than 0.55 and Kappa simulations higher than 0.47. Of the four RNN models, the LSTM and LSTM-peephole models significantly outperform the other two, and the LSTM-peephole model slightly outperforms the LSTM model. In addition, the results indicate that the RNN models with gated architectures, which are better able to model longer temporal dependencies, significantly outperform the simple RNN model. Moreover, the predictive performance of the LSTM-peephole model gradually decreases as the temporal sequence length of the training set decreases. These results demonstrate the benefit of taking temporal dependency into account when modeling the LUC process with RNNs.
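The key architectural difference the abstract highlights is the peephole connection, which lets each LSTM gate inspect the cell state in addition to the input and hidden state. The paper does not publish its implementation, so the following is a minimal single-unit sketch of both variants in pure Python (hidden size 1, random scalar weights), intended only to make the gate equations concrete; `run` folds a toy annual sequence through the cell the way the models fold yearly land-use observations.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class LSTMCell:
    """Single-unit LSTM cell, optionally with peephole connections.

    In the peephole variant, the input and forget gates additionally see
    the previous cell state c_{t-1}, and the output gate sees the updated
    cell state c_t. All weights are scalars here for clarity.
    """
    def __init__(self, peephole=False, seed=0):
        rng = random.Random(seed)
        # one (input weight, recurrent weight, bias) triple per gate:
        # i = input gate, f = forget gate, o = output gate, g = candidate
        self.w = {gate: [rng.uniform(-0.5, 0.5) for _ in range(3)]
                  for gate in ("i", "f", "o", "g")}
        # peephole weights from the cell state into the three gates
        self.p = ({gate: rng.uniform(-0.5, 0.5) for gate in ("i", "f", "o")}
                  if peephole else None)

    def step(self, x, h, c):
        def preact(gate, cell):
            wx, wh, b = self.w[gate]
            z = wx * x + wh * h + b
            if self.p is not None and cell is not None:
                z += self.p[gate] * cell  # peephole: the gate sees the cell state
            return z

        i = sigmoid(preact("i", c))        # input gate (peeks at c_{t-1})
        f = sigmoid(preact("f", c))        # forget gate (peeks at c_{t-1})
        g = math.tanh(preact("g", None))   # candidate update (no peephole)
        c_new = f * c + i * g              # new cell state
        o = sigmoid(preact("o", c_new))    # output gate (peeks at c_t)
        h_new = o * math.tanh(c_new)       # new hidden state
        return h_new, c_new

def run(cell, sequence):
    """Fold a 1-D input sequence (e.g. yearly observations) through the cell."""
    h = c = 0.0
    for x in sequence:
        h, c = cell.step(x, h, c)
    return h
```

With the same seed, the two variants share the base weights, so any difference in output comes purely from the peephole terms; the tanh squashing keeps the hidden state in (-1, 1). In the paper, such a recurrence would be applied per pixel over the annual sequence, with a classifier head mapping the final hidden state to a land-use class.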

Citations (2)