KeyMemoryRNN: A Flexible Prediction Framework for Spatiotemporal Prediction Networks
Most previous recurrent neural networks for spatiotemporal prediction have difficulty learning long-term spatiotemporal correlations and capturing skip-frame correlations, because they update their memory states using only information from the previous time-step node and tend to suffer from gradient propagation difficulties. We propose a new framework, KeyMemoryRNN, with two contributions. First, we propose the KeyTranslate Module, which extracts the most effective historical memory state, named the keyword state, and the KeyMemoryLSTM, which uses the keyword state to update the hidden state and thereby capture skip-frame correlations. KeyMemoryLSTM is trained in two stages; in the second stage it adaptively skips the updates of some time-step nodes, building a shorter memory information flow that alleviates gradient propagation difficulties and helps learn long-term spatiotemporal correlations. Second, both the KeyTranslate Module and KeyMemoryLSTM are flexible add-on modules, so they can be applied to most RNN-based prediction networks to build a KeyMemoryRNN with a different base network. KeyMemoryRNN achieves state-of-the-art results on three spatiotemporal prediction tasks, and we provide ablation studies and memory analysis to verify its effectiveness.
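The abstract's core mechanism — pooling a "keyword" state from past memory states and adaptively skipping some time-step updates to shorten the gradient path — can be illustrated with a toy sketch. This is not the authors' implementation: `keyword_state`, `skip_mask`, and the plain `tanh` recurrence standing in for the actual LSTM cell are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def step(h, x, W, U, b):
    # Plain tanh recurrence as a stand-in for the paper's LSTM cell.
    return np.tanh(W @ x + U @ h + b)

def keyword_state(history):
    # Hypothetical stand-in for the KeyTranslate Module: attention-style
    # pooling over past hidden states, weighted by similarity to the
    # most recent one.
    H = np.stack(history)              # (t, d)
    scores = H @ history[-1]           # similarity to the latest state
    w = np.exp(scores - scores.max())  # softmax weights
    w /= w.sum()
    return w @ H                       # convex combination of past states

def run(xs, d, skip_mask):
    # skip_mask[t] = True means step t keeps its state unchanged,
    # shortening the recurrence (and gradient) path through that step.
    W = rng.standard_normal((d, d)) * 0.1
    U = rng.standard_normal((d, d)) * 0.1
    b = np.zeros(d)
    h = np.zeros(d)
    history = [h]
    for t, x in enumerate(xs):
        if skip_mask[t]:
            history.append(h)          # skipped: copy state forward
            continue
        key = keyword_state(history)   # source of skip-frame correlation
        h = step((h + key) / 2, x, W, U, b)
        history.append(h)
    return h
```

In the real framework the skip decisions are learned in a second training stage rather than given as a fixed mask, and the keyword state is extracted by a trained module rather than fixed similarity pooling.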
Saved in:
Main authors: Shengchun Wang; Xiang Lin; Huijie Zhu
Format: article
Language: EN
Published: IEEE, 2021
Subjects: Recurrent neural networks; spatiotemporal prediction; prediction network; Electrical engineering. Electronics. Nuclear engineering (TK1-9971)
Online access: https://doaj.org/article/2b32dae151a54f3baa0971aa0135b6a9
id: oai:doaj.org-article:2b32dae151a54f3baa0971aa0135b6a9
record_format: dspace
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3114215
Published: 2021-01-01
Last updated: 2021-11-18T00:11:22Z
Online access: https://doaj.org/article/2b32dae151a54f3baa0971aa0135b6a9
IEEE Xplore: https://ieeexplore.ieee.org/document/9541390/
Source: IEEE Access, Vol 9, Pp 147678-147691 (2021)
institution: DOAJ
collection: DOAJ
language: EN
topic: Recurrent neural networks; spatiotemporal prediction; prediction network; Electrical engineering. Electronics. Nuclear engineering; TK1-9971
format: article
author: Shengchun Wang; Xiang Lin; Huijie Zhu
author_sort: Shengchun Wang
title: KeyMemoryRNN: A Flexible Prediction Framework for Spatiotemporal Prediction Networks
publisher: IEEE
publishDate: 2021
url: https://doaj.org/article/2b32dae151a54f3baa0971aa0135b6a9