Title | A Novel Transformer-Based Model for Power Consumption Prediction
Authors | 김광현(Gwang-Hyeon Kim); 오하령(Ha-Ryoung Oh); 성영락(Young-Rak Seong)
DOI | https://doi.org/10.5370/KIEE.2024.73.12.2333
Keywords | Transformer; Time-Series Forecasting; Power Consumption; MLP; Depthwise Separable Convolution
Abstract | Recently, the Transformer has demonstrated excellent performance in time-series forecasting by effectively addressing the long-term dependency problems of traditional prediction models. However, the Transformer also has limitations, such as a limited ability to learn the sequential characteristics of time-series data and a need for substantial computational resources due to its complex structure. To overcome these limitations and improve prediction accuracy, this paper proposes EMformer, a model based on the Transformer architecture. EMformer reduces computational cost by replacing the decoder with an MLP and strengthens sequential feature extraction by replacing the feed-forward network in the encoder with a depthwise separable convolution. The model's performance is evaluated on a power consumption dataset and compared with other time-series forecasting models. The results show that EMformer improves MAPE by up to 57.3% and RMSE by up to 30.75% over the compared models.
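To make the two architectural changes in the abstract concrete, the sketch below is a minimal PyTorch rendering of an encoder-only Transformer whose feed-forward sublayer is swapped for a depthwise separable convolution, with an MLP head standing in for the decoder. All class names, hyperparameters (model width, head count, kernel size, forecast horizon), and the post-norm residual layout are illustrative assumptions, not the paper's actual EMformer configuration.

```python
# Hedged sketch of an EMformer-style model; hyperparameters are guesses.
import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """Depthwise conv over the time axis followed by a pointwise (1x1) conv."""
    def __init__(self, d_model: int, kernel_size: int = 3):
        super().__init__()
        self.depthwise = nn.Conv1d(d_model, d_model, kernel_size,
                                   padding=kernel_size // 2, groups=d_model)
        self.pointwise = nn.Conv1d(d_model, d_model, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); Conv1d expects (batch, channels, seq_len)
        y = x.transpose(1, 2)
        y = self.pointwise(self.depthwise(y))
        return y.transpose(1, 2)

class ConvEncoderLayer(nn.Module):
    """Encoder layer with the position-wise FFN replaced by a depthwise
    separable convolution, per the abstract's description."""
    def __init__(self, d_model: int, n_heads: int, kernel_size: int = 3):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.conv = DepthwiseSeparableConv(d_model, kernel_size)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        a, _ = self.attn(x, x, x)
        x = self.norm1(x + a)              # residual around self-attention
        x = self.norm2(x + self.conv(x))   # residual around the conv "FFN"
        return x

class EMformerSketch(nn.Module):
    """Encoder-only Transformer; the decoder is replaced by an MLP head."""
    def __init__(self, n_features: int, d_model: int = 64, n_heads: int = 4,
                 n_layers: int = 2, horizon: int = 24):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        self.layers = nn.ModuleList(
            ConvEncoderLayer(d_model, n_heads) for _ in range(n_layers))
        # MLP head mapping the flattened encoder output to the forecast horizon
        self.head = nn.Sequential(nn.Flatten(1), nn.LazyLinear(128),
                                  nn.ReLU(), nn.Linear(128, horizon))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features) -> (batch, horizon)
        h = self.embed(x)
        for layer in self.layers:
            h = layer(h)
        return self.head(h)

# Smoke test: 96 past steps of a single load series -> 24-step forecast.
model = EMformerSketch(n_features=1)
print(model(torch.randn(8, 96, 1)).shape)  # torch.Size([8, 24])
```

Mapping encoder outputs straight to the horizon through an MLP avoids the autoregressive decoder entirely, which is where the computational saving claimed in the abstract would come from; the convolution restores local sequential structure that a position-wise FFN cannot capture.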