Organizer: Committee of the Joint Conference of Electrical and Information-Related Societies, Kyushu Branch
Co-organizer: Saga University
Conference: 2021 Joint Conference of Electrical and Information-Related Societies in Kyushu
Edition: 74th
Venue: Held online (conference headquarters: Saga University, Honjo Campus)
Dates: 2021/09/24 - 2021/09/25
In natural language processing, the Transformer model outperforms the LSTM. We applied the Transformer to time-series prediction, where its attention mechanism allows each output step to draw on global information from the whole input sequence. We trained and tested the model on flood data from the Shirakawa River, confirming that the Transformer has better predictive capability.
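As an illustrative sketch only (not the authors' implementation), the scaled dot-product self-attention step that lets a Transformer aggregate global information from a time series can be written as follows; all array shapes and the toy data are assumptions for demonstration:

```python
import numpy as np

def self_attention(x, wq, wk, wv):
    """Scaled dot-product self-attention over a time series.

    x:           (T, d) sequence of T time steps with d features
    wq, wk, wv:  (d, d) query/key/value projection matrices

    Each output step is a softmax-weighted sum over ALL input steps,
    which is how the Transformer captures global information, in
    contrast to an LSTM that propagates state one step at a time.
    Returns the (T, d) context vectors and the (T, T) attention weights.
    """
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])            # (T, T) pairwise scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over the time axis
    return weights @ v, weights

# Toy example: 5 time steps of a 4-dimensional series
# (e.g. hypothetical water-level features).
rng = np.random.default_rng(0)
T, d = 5, 4
x = rng.standard_normal((T, d))
wq, wk, wv = (rng.standard_normal((d, d)) for _ in range(3))
out, attn = self_attention(x, wq, wk, wv)
print(out.shape)   # every output step attends to the whole sequence
```

In a full forecasting model, this attention block would be stacked with positional encodings and feed-forward layers, and a final linear head would map the context vectors to the predicted values.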