References


Chapter 1

[1] D.D. Placido, “Toys ‘R’ Us AI-Generated Ad Controversy, Explained,” Forbes, June 26, 2024. https://www.forbes.com/sites/danidiplacido/2024/06/26/the-toys-r-us-ai-generated-ad-controversy-explained/

[2] “What Is a Foundation Model? An Explainer for Non-Experts,” Stanford HAI. https://hai.stanford.edu/news/what-foundation-model-explainer-non-experts

[3] A. Vaswani et al., “Attention Is All You Need,” June 2017. https://arxiv.org/pdf/1706.03762

Chapter 2

[1] B.N. Oreshkin, D. Carpov, N. Chapados, and Y. Bengio, “N-BEATS: Neural basis expansion analysis for interpretable time series forecasting,” presented at ICLR 2020. https://arxiv.org/pdf/1905.10437

Chapter 3

[1] A. Garza, C. Challu, and M. Mergenthaler-Canseco, “TimeGPT-1.” https://arxiv.org/pdf/2310.03589

Chapter 4

[1] K. Rasul et al., “Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting.” https://arxiv.org/pdf/2310.08278

[2] “time-series-foundation-models/lag-llama,” GitHub, June 30, 2024. https://github.com/time-series-foundation-models/lag-llama

Chapter 5

[1] A.F. Ansari et al., “Chronos: Learning the Language of Time Series.” https://arxiv.org/pdf/2403.07815

[2] “Exploring Transfer Learning with T5: the Text-To-Text Transfer Transformer,” Google Research. https://research.google/blog/exploring-transfer-learning-with-t5-the-text-to-text-transfer-transformer/

Chapter 6

Chapter 7

Chapter 8

Chapter 9