An attention-based deep learning model for multi-horizon time series forecasting by considering periodic characteristic

Recently, transformer-based models have exhibited strong performance in multi-horizon time series forecasting tasks. However, their core module, the self-attention mechanism, is insensitive to temporal order and suffers from attention dispersion over long sequences. These limitations prevent such models from fully leveraging the features of time series data, particularly periodicity. Furthermore, the lack of consideration for temporal order also hinders the identification of important temporal variables in transformers. To resolve these problems, this article develops an attention-based deep learning model that better exploits periodicity to improve prediction accuracy and enhance interpretability. We design a parallel skip LSTM module and a periodicity information utilization module to reinforce the connection between corresponding time steps in different periods and to address excessively sparse attention. An improved variable selection mechanism is embedded into the parallel skip LSTM so that temporal information is taken into account when analyzing interpretability. Experimental results on several types of real-world datasets show that the proposed model outperforms numerous baseline models in prediction accuracy while offering a degree of interpretability.
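
The abstract describes a parallel skip LSTM that links corresponding time steps across different periods. As a rough, hypothetical sketch only (the paper's actual module is not reproduced in this record), the following PyTorch snippet shows one common way to realize such a skip recurrence for a known period p; the class name SkipLSTM, the fixed period argument, and the reshape-based grouping are assumptions made for illustration.

```python
# Hypothetical sketch, not the authors' code: a skip LSTM that recurs over
# time steps occupying the same position ("phase") within each period, so
# hidden states connect corresponding steps across different periods.
import torch
import torch.nn as nn

class SkipLSTM(nn.Module):
    def __init__(self, input_size: int, hidden_size: int, period: int):
        super().__init__()
        self.period = period
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, input_size); seq_len assumed divisible by period
        b, t, d = x.shape
        p = self.period
        # Group same-phase steps: (b, t, d) -> (b, t//p, p, d) -> (b*p, t//p, d)
        x = x.reshape(b, t // p, p, d).permute(0, 2, 1, 3).reshape(b * p, t // p, d)
        out, _ = self.lstm(x)  # recur only over same-phase steps across periods
        # Restore the original time ordering: (b*p, t//p, h) -> (b, t, h)
        out = out.reshape(b, p, t // p, -1).permute(0, 2, 1, 3).reshape(b, t, -1)
        return out

# Example: hourly data with a daily period of 24 steps (one week of history)
model = SkipLSTM(input_size=8, hidden_size=32, period=24)
y = model(torch.randn(4, 168, 8))  # -> (4, 168, 32)
```

With hourly data and a daily period of 24, the shared LSTM recurs over the values observed at the same hour on consecutive days, which is the kind of cross-period connection the abstract refers to.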

Bibliographic Details
Main Authors: Fang, Jin, Guo, Xin, Liu, Yujia, Chang, Xiaokun, Fujita, Hamido, Wu, Jian
Format: Article
Published: Elsevier Ltd 2023
Subjects: T Technology (General)
Online Access:http://eprints.utm.my/106425/
http://dx.doi.org/10.1016/j.cie.2023.109667
id my.utm.106425
record_format eprints
spelling my.utm.106425 2024-06-30T06:08:26Z http://eprints.utm.my/106425/ Fang, Jin and Guo, Xin and Liu, Yujia and Chang, Xiaokun and Fujita, Hamido and Wu, Jian (2023) An attention-based deep learning model for multi-horizon time series forecasting by considering periodic characteristic. Computers and Industrial Engineering, 185. ISSN 0360-8352. Elsevier Ltd. Peer-reviewed article. DOI: 10.1016/j.cie.2023.109667. http://dx.doi.org/10.1016/j.cie.2023.109667
institution Universiti Teknologi Malaysia
building UTM Library
collection Institutional Repository
continent Asia
country Malaysia
content_provider Universiti Teknologi Malaysia
content_source UTM Institutional Repository
url_provider http://eprints.utm.my/
topic T Technology (General)
format Article
author Fang, Jin
Guo, Xin
Liu, Yujia
Chang, Xiaokun
Fujita, Hamido
Wu, Jian
title An attention-based deep learning model for multi-horizon time series forecasting by considering periodic characteristic
publisher Elsevier Ltd
publishDate 2023
url http://eprints.utm.my/106425/
http://dx.doi.org/10.1016/j.cie.2023.109667