
What is Context Length in an LLM?

Decoding Data Science
838 views

Published: 12 Sep 2024

Comments: 5
@maryrosedelrosario6084 · 4 months ago
Thanks for the info!
@tokhenz · 5 months ago
thank you for the explanation
@decodingdatascience · 5 months ago
Thank you, do subscribe!
@dingzhang46 · 4 months ago
Thanks for your explanation. I know "context length" is the maximum input the model can process at once, but is that only the prompt input? Can you also explain the maximum for the model's completion? And what are the two parameters called in code (Hugging Face)? Thank you so much.
@decodingdatascience · 8 months ago
🎯 Key takeaways for quick navigation:
00:00 📚 The context window and context length are crucial components of large language models (LLMs).
00:13 🔄 A longer context length lets a model generate better responses by considering both the prompt and the completion.
00:55 📝 Context lengths vary across models, e.g., GPT-3 with 2,048 tokens and Claude with 100K tokens.
01:50 🧾 Context length is counted in tokens, not words, and you should reserve space for the response.
02:30 💻 AI tools like Claude can summarize large amounts of text, up to the limit of their context length.
03:22 🤖 Understanding context length is essential for using large language models effectively.
Made with HARPA AI
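The takeaways above can be sketched in code: since the context window covers the prompt *and* the completion, you have to budget tokens between the two. This is a minimal illustrative sketch, assuming a rough heuristic of about 4 characters per token for English text (real token counts come from the model's own tokenizer):

```python
# Sketch of context-window budgeting: prompt tokens + completion tokens
# must fit inside the model's context length.
# The 4-chars-per-token ratio is a rough English-text heuristic, not exact.

def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token."""
    return max(1, len(text) // 4)

def max_completion_tokens(context_length: int, prompt: str) -> int:
    """Tokens left for the model's response after the prompt is counted."""
    remaining = context_length - estimate_tokens(prompt)
    if remaining <= 0:
        raise ValueError("Prompt alone exceeds the context window")
    return remaining

# Example: a 2,048-token window (GPT-3-sized) with a short prompt
prompt = "Summarize the following article in three bullet points: ..."
budget = max_completion_tokens(2048, prompt)
print(f"Room left for the completion: ~{budget} tokens")
```

The same arithmetic is why a 100K-token window (Claude-sized) can summarize a long document in one pass: the document consumes most of the budget, and the summary fits in what remains.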