Journal of the Eastern Asia Society for Transportation Studies
Online ISSN : 1881-1124
ISSN-L : 1341-8521
I: Road Traffic Engineering
Traffic Flow Prediction of Expressway Bottlenecks by an Attention Long Short-Term Memory Model
Xingwei LIU, Kuniaki SASAKI
Journal: Free Access

2022, Volume 14, Pages 1989-2003

Abstract

Deep learning methods have recently been applied to prediction tasks with high accuracy. However, deep learning models operate as a kind of "black box" that can be difficult to interpret. The recently proposed attention mechanism offers the same level of accuracy as deep learning models while also providing interpretability. In this study, we propose a long short-term memory (LSTM) approach with an attention mechanism framework, which we call the attention LSTM model; it processes a traffic flow sequence and learns the temporal features of the traffic network. Empirical results demonstrate that the proposed model outperforms the classical ARIMA model, a random forest, and a conventional LSTM model, all of which already produce good results. This higher level of performance is reflected in lower errors, faster convergence, and more accurate predictions. Furthermore, the proposed model offers a possible explanation for the high accuracy of traffic flow prediction: the attention mechanism shows that, in addition to the morning and evening peaks, some periods in the afternoon also have a greater impact on subsequent traffic flow, suggesting new possibilities for traffic flow prediction. This finding helps us understand the trends and patterns of expressway traffic flow.
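As a rough illustration of the kind of architecture the abstract describes, the following is a minimal sketch of an LSTM with attention over its time steps, written in PyTorch. The layer sizes, the additive form of the attention scoring, the single-sensor input, and the one-step prediction target are assumptions made for illustration, not the authors' exact configuration.

```python
# Minimal sketch of an attention LSTM for one-step traffic flow prediction.
# Assumes PyTorch; sizes and attention form are illustrative, not the paper's.
import torch
import torch.nn as nn


class AttentionLSTM(nn.Module):
    def __init__(self, input_size=1, hidden_size=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.attn_score = nn.Linear(hidden_size, 1)  # scores each time step
        self.out = nn.Linear(hidden_size, 1)         # predicts the next flow value

    def forward(self, x):
        # x: (batch, seq_len, input_size) -- past traffic flow observations
        h, _ = self.lstm(x)                          # (batch, seq_len, hidden)
        scores = self.attn_score(torch.tanh(h))      # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)       # attention over time steps
        context = (weights * h).sum(dim=1)           # weighted summary of the sequence
        # Return both the prediction and the attention weights for interpretation.
        return self.out(context), weights.squeeze(-1)


# Usage: predict the next flow value from the previous 24 observations.
model = AttentionLSTM()
past_flow = torch.randn(8, 24, 1)                    # dummy batch of 8 sequences
pred, attn = model(past_flow)
print(pred.shape, attn.shape)                        # torch.Size([8, 1]) torch.Size([8, 24])
```

The returned attention weights over the input time steps are what make the kind of interpretation described in the abstract possible, for example identifying which periods of the day carry the most weight for the subsequent traffic flow.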

© 2022 Eastern Asia Society for Transportation Studies