Long-term forecasting using transformer based on multiple time series
Korean J Appl Stat 2024;37(5):583-598
Published online October 31, 2024
© 2024 The Korean Statistical Society.

Jaeyong Lee (a), Hyun Jun Kim (a), Changwon Lim (1, a)


(a) Department of Applied Statistics, Chung-Ang University
(1) Department of Applied Statistics, Chung-Ang University, 84 Heukseok-ro, Dongjak-Gu, Seoul 06974, Korea. E-mail: clim@cau.ac.kr
Received August 5, 2024; Revised August 26, 2024; Accepted August 26, 2024.
Abstract
Numerous contemporary studies explore the application of artificial intelligence techniques such as recurrent neural networks (RNN) and long short-term memory (LSTM) to time series forecasting. Among these AI models, the Transformer, a high-performance model originally developed for natural language processing, has attracted significant attention. Nevertheless, many time series forecasting models do not adequately address long-term prediction. Therefore, this study proposes a long-term forecasting model based on the Transformer architecture that incorporates a “target time series” together with multiple “reference time series” that may influence the forecast.
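To illustrate the kind of architecture the abstract describes, the following is a minimal PyTorch sketch of a Transformer that forecasts a long horizon of a target series while attending to several aligned reference series. It is not the authors' implementation: the class name, the channel-concatenation input scheme, and all hyperparameters (input length, forecast horizon, model width, head count) are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's model): a Transformer encoder
# that consumes the past of one target series plus several reference series
# and emits the entire future horizon of the target in one shot.
import torch
import torch.nn as nn


class MultiSeriesTransformerForecaster(nn.Module):
    def __init__(self, n_reference: int, d_model: int = 64, n_heads: int = 4,
                 n_layers: int = 2, input_len: int = 96, horizon: int = 336):
        super().__init__()
        n_channels = 1 + n_reference                # target + reference series
        self.input_proj = nn.Linear(n_channels, d_model)
        self.pos_embed = nn.Parameter(torch.zeros(1, input_len, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Direct multi-step head: predict the whole horizon at once,
        # avoiding the error accumulation of recursive one-step forecasting.
        self.head = nn.Linear(input_len * d_model, horizon)

    def forward(self, target: torch.Tensor, references: torch.Tensor) -> torch.Tensor:
        # target:     (batch, input_len)          past values of the target series
        # references: (batch, input_len, n_ref)   aligned reference series
        x = torch.cat([target.unsqueeze(-1), references], dim=-1)
        h = self.input_proj(x) + self.pos_embed    # (batch, input_len, d_model)
        h = self.encoder(h)
        return self.head(h.flatten(1))             # (batch, horizon)


# Example: forecast 336 future steps of the target from 96 past steps of the
# target and three reference series (random values here, for shape checking).
model = MultiSeriesTransformerForecaster(n_reference=3)
y_hat = model(torch.randn(8, 96), torch.randn(8, 96, 3))
print(y_hat.shape)  # torch.Size([8, 336])
```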
Keywords : artificial intelligence, deep learning, transformer, time series forecasting
References
  1. Dosovitskiy A, Beyer L, Kolesnikov A et al. (2020). An image is worth 16x16 words: Transformers for image recognition at scale, arXiv preprint arXiv:2010.11929.
  2. Gomez AN, Ren M, Urtasun R, and Grosse RB (2017). The reversible residual network: Backpropagation without storing activations, Advances in Neural Information Processing Systems, 2017-December, 2215-2225.
  3. He K, Zhang X, Ren S, and Sun J (2016). Deep residual learning for image recognition. In Proceedings of the IEEE conference on computer vision and pattern recognition, Las Vegas, NV, 770-778.
  4. Hochreiter S and Schmidhuber J (1997). Long short-term memory, Neural Computation, 9, 1735-1780.
  5. Joo IY (2012). A case study on crime prediction using time series models, Korean Security Journal, 139-169.
  6. Kim D-K and Kim K (2021). Style-based transformer for time series forecasting, The Transactions of the Korea Information Processing Society, 10, 579-586.
  7. Kim J, Choi Y-T, Lee S-H, and Woo SI (2020). Long-term settlement prediction of railway concrete track based on recurrent neural network (RNN), Journal of the Korean Geotechnical Society, 36, 5-14.
  8. Kitaev N, Kaiser Ł, and Levskaya A (2020). Reformer: The efficient transformer, arXiv preprint arXiv:2001.04451.
  9. Koo M-W (2021). A Korean speech recognition based on conformer, The Journal of the Acoustical Society of Korea, 40, 488-495.
  10. Lee S and Lee J-H (2016). Customer churn prediction using RNN, Proceedings of the Korean Society of Computer Information Conference, 24, 45-48.
  11. Lee YJ (2017, October 26). Why is 30-year-old whiskey expensive?... 3% evaporates each year during aging, Korea Economic Daily.
  12. Rumelhart DE, Hinton GE, and Williams RJ (1986). Learning representations by back-propagating errors, Nature, 323, 533-536.
  13. Rural Development Administration (2019, June 21). The uses and components of mature ginseng and seedling ginseng are different, Korea Policy Briefing.
  14. Shim J-W and Chung E-C (2010). A time series analysis on ratio of Chonsei to monthly rent with variable deposit contracts of apartments, Housing Studies Review, 18, 5-30.
  15. Song HJ, Choi HS, Kim SW, and Oh S-H (2019). A study on financial time series data volatility prediction method using AI’s LSTM method, Journal of Knowledge Information Technology and Systems, 14, 665-673.
  16. Vaswani A, Shazeer N, Parmar N et al. (2017). Attention is all you need, Advances in Neural Information Processing Systems, 2017-December, 6000-6010.
  17. Zhou H, Zhang S, Peng J, Zhang S, Li J, Xiong H, and Zhang W (2021). Informer: Beyond efficient transformer for long sequence time-series forecasting, Proceedings of the AAAI Conference on Artificial Intelligence, 35, 11106-11115.