References
[1] R. Bommasani et al., “On the Opportunities and Risks of Foundation Models.” arXiv, Jul. 12, 2022. doi: 10.48550/arXiv.2108.07258.

[2] H. Xue and F. D. Salim, “PromptCast: A New Prompt-based Learning Paradigm for Time Series Forecasting,” arXiv.org. Accessed: Jan. 08, 2024. [Online]. Available: https://arxiv.org/abs/2210.08964v5

[3] D. Spathis and F. Kawsar, “The first step is the hardest: Pitfalls of Representing and Tokenizing Temporal Data for Large Language Models,” arXiv.org. Accessed: Jan. 08, 2024. [Online]. Available: https://arxiv.org/abs/2309.06236v1

[4] N. Gruver, M. Finzi, S. Qiu, and A. G. Wilson, “Large Language Models Are Zero-Shot Time Series Forecasters,” arXiv.org. Accessed: Jan. 08, 2024. [Online]. Available: https://arxiv.org/abs/2310.07820v1

[5] T. Zhou, P. Niu, X. Wang, L. Sun, and R. Jin, “One Fits All: Power General Time Series Analysis by Pretrained LM,” arXiv.org. Accessed: Jan. 08, 2024. [Online]. Available: https://arxiv.org/abs/2302.11939v6

[6] D. Cao et al., “TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting,” arXiv.org. Accessed: Jan. 08, 2024. [Online]. Available: https://arxiv.org/abs/2310.04948v2

[7] C. Chang, W.-Y. Wang, W.-C. Peng, and T.-F. Chen, “LLM4TS: Aligning Pre-Trained LLMs as Data-Efficient Time-Series Forecasters,” arXiv.org. Accessed: Jan. 08, 2024. [Online]. Available: https://arxiv.org/abs/2308.08469v4

[8] M. Jin et al., “Time-LLM: Time Series Forecasting by Reprogramming Large Language Models.”

[9] A. Garza and M. Mergenthaler-Canseco, “TimeGPT-1.” arXiv, Oct. 05, 2023. Accessed: Feb. 14, 2024. [Online]. Available: http://arxiv.org/abs/2310.03589

[10] A. Das, W. Kong, R. Sen, and Y. Zhou, “A decoder-only foundation model for time-series forecasting.” arXiv, Feb. 04, 2024. Accessed: Feb. 13, 2024. [Online]. Available: http://arxiv.org/abs/2310.10688

[11] G. Woo, C. Liu, A. Kumar, C. Xiong, S. Savarese, and D. Sahoo, “Unified Training of Universal Time Series Forecasting Transformers.” arXiv, Feb. 04, 2024. Accessed: Feb. 13, 2024. [Online]. Available: http://arxiv.org/abs/2402.02592