
Chronos: Learning the Language of Time Series with Abdul Fatir Ansari - 685 

The TWIML AI Podcast with Sam Charrington
19K subscribers · 1.1K views

Today we're joined by Abdul Fatir Ansari, a machine learning scientist at AWS AI Labs in Berlin, to discuss his paper, "Chronos: Learning the Language of Time Series" - arxiv.org/abs/2403.07815. Fatir explains the challenges of leveraging pre-trained language models for time series forecasting. We explore the advantages of Chronos over statistical models, as well as its promising results in zero-shot forecasting benchmarks. Finally, we address critiques of Chronos, the ongoing research to improve synthetic data quality, and the potential for integrating Chronos into production systems.
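For readers skimming the show notes: the paper's core idea is to treat a time series as a "language" by mean-scaling the context window and quantizing the values into a fixed vocabulary of bins, so an off-the-shelf language model such as T5 can be trained on the resulting token sequences. A minimal illustrative sketch follows; the function name, bin count, and range here are assumptions for illustration, not the paper's exact configuration:

    import numpy as np

    def chronos_style_tokenize(context, n_bins=4094, low=-15.0, high=15.0):
        # Mean scaling: normalize by the mean absolute value of the context window.
        scale = np.mean(np.abs(context))
        if scale == 0:
            scale = 1.0
        scaled = np.asarray(context, dtype=float) / scale
        # Uniform quantization: each scaled value becomes one of n_bins discrete
        # token ids, so the model can treat the series like a sequence of words.
        edges = np.linspace(low, high, n_bins + 1)
        tokens = np.clip(np.digitize(scaled, edges) - 1, 0, n_bins - 1)
        return tokens, scale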
🎧 / 🎥 Listen or watch the full episode on our page: twimlai.com/go/685.
🔔 Subscribe to our channel for more great content just like this: ru-vid.com?sub_confi...
🗣️ CONNECT WITH US!
===============================
Subscribe to the TWIML AI Podcast: twimlai.com/podcast/twimlai/
Follow us on Twitter: / twimlai
Follow us on LinkedIn: / twimlai
Join our Slack Community: twimlai.com/community/
Subscribe to our newsletter: twimlai.com/newsletter/
Want to get in touch? Send us a message: twimlai.com/contact/
📖 CHAPTERS
===============================
00:00 - Introduction
02:11 - Inspiration for Chronos
04:30 - Overview of statistical models
07:04 - Overfitting
08:17 - LLMs in time series forecasting
10:20 - Tokenization
15:25 - Why T5?
16:35 - Data augmentation
25:28 - Evaluation
27:45 - Results
31:15 - In-domain vs. zero-shot
33:35 - Performance across different patterns
36:25 - Critique of Chronos
40:15 - Chronos in production
41:00 - Future of Chronos
42:00 - Conclusion
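If you want to try the zero-shot forecasting discussed in the episode, the authors released an open-source chronos-forecasting package. A minimal usage sketch is below; the checkpoint name, context values, and sampling settings are illustrative choices, not the exact setup discussed in the episode:

    import torch
    from chronos import ChronosPipeline  # pip install chronos-forecasting

    # Load a pretrained Chronos checkpoint (several T5 sizes are released).
    pipeline = ChronosPipeline.from_pretrained(
        "amazon/chronos-t5-small",
        device_map="cpu",
        torch_dtype=torch.bfloat16,
    )

    # Zero-shot probabilistic forecast: sample 20 trajectories, 12 steps ahead.
    context = torch.tensor([112.0, 118.0, 132.0, 129.0, 121.0, 135.0, 148.0, 148.0])
    forecast = pipeline.predict(context, prediction_length=12, num_samples=20)

    # forecast has shape [num_series, num_samples, prediction_length];
    # summarize the sample paths with quantiles for a prediction interval.
    low, median, high = torch.quantile(
        forecast[0].float(), torch.tensor([0.1, 0.5, 0.9]), dim=0
    )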
🔗 LINKS & RESOURCES
===============================
Large Language Models Are Zero-Shot Time Series Forecasters - arxiv.org/abs/2310.07820
Time-LLM: Time Series Forecasting by Reprogramming Large Language Models - arxiv.org/abs/2310.01728
LLM4TS: Aligning Pre-Trained LLMs as Data-Efficient Time-Series Forecasters - arxiv.org/abs/2308.08469
Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting - arxiv.org/abs/2310.08278
Unified Training of Universal Time Series Forecasting Transformers (Moirai) - arxiv.org/abs/2402.02592
📸 Camera: amzn.to/3TQ3zsg
🎙️Microphone: amzn.to/3t5zXeV
🚦Lights: amzn.to/3TQlX49
🎛️ Audio Interface: amzn.to/3TVFAIq
🎚️ Stream Deck: amzn.to/3zzm7F5

Science

Published: 7 Aug 2024

Comments: 3

@syedmohammadghazi6133 · 2 months ago
Hey, that's good work... I've done research along similar lines and plan to publish it at the end of the year. Just wanted to ask: have you heard of the TranAD models? They're built primarily for anomaly detection, but I'm curious how well they would work for your use case.

@btcoal · 2 months ago
Paper link?

@twimlai · 2 months ago
Hi @btcoal. Here's the paper link: arxiv.org/abs/2403.07815.