Transformer-based deep learning architecture for time series forecasting
Date
2024
Journal Title
Software Impacts
Publisher
Elsevier B.V.
Abstract
Time series forecasting is challenging because the underlying data are often non-stationary, nonlinear, and chaotic. Traditional deep learning models such as RNNs, LSTMs, and GRUs process data sequentially, which makes them inefficient for long sequences. To overcome these limitations, we propose a transformer-based deep learning architecture that uses an attention mechanism for parallel processing, improving both prediction accuracy and efficiency. This paper presents user-friendly code implementing the proposed architecture. © 2024
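As a minimal illustration of the attention mechanism the abstract refers to, the sketch below computes scaled dot-product self-attention over a window of time steps using NumPy. It is not the article's implementation; the array sizes and function name are hypothetical, chosen only to show how every time step attends to every other in parallel rather than sequentially.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # (T, T) pairwise time-step scores
    # Numerically stable softmax over each row
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                       # (T, d_v) context vectors

# Toy window: 8 time steps, each embedded in 4 dimensions (hypothetical sizes)
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))
out = scaled_dot_product_attention(X, X, X)  # self-attention over the window
print(out.shape)
```

Because the attention scores form a full T-by-T matrix, all time steps are compared at once, which is the parallelism the abstract contrasts with the step-by-step recurrence of RNNs, LSTMs, and GRUs.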
Keywords
Deep learning, Time series forecasting, Transformer