
Whiteboarding - Large Language Model (LLM) Tech Stack 

650 AI Lab
9K subscribers
1.3K views

Published: 18 Sep 2024

Comments: 12
@emrahe468 1 year ago
This is probably the most complete and clean LLM lecture on the entire internet. Thank you for your efforts and knowledge. I'm currently using the GPT-4 API, which provides extremely accurate results for my project, but it can be expensive if used excessively. As a cost-saving measure, I'm considering using one of the downloadable models you recommended. However, I'm wondering if it's possible to train these models with custom data. Please keep in mind that my computer has limited resources, and the most I can use is Colab+ (the enterprise version is not an option for me).
@650AILab 1 year ago
Everyone has the exact same concern about the OpenAI API cost; however, in my opinion, none of the open-source LLM models are as performant and accurate as OpenAI's GPT-x models. Others will contradict my statement, but for certain business use cases GPT-x models are still on top, so I understand what you are saying. Please try GPT4All and Cerebras-GPT as open-source models to see if they help you out: www.cerebras.net/blog/cerebras-gpt-a-family-of-open-compute-efficient-large-language-models/ Thanks for the comment, appreciate it sincerely.
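
A minimal sketch of what trying one of these open models locally might look like, assuming the Hugging Face transformers library and the cerebras/Cerebras-GPT-1.3B checkpoint; the model choice, prompt, and generation settings are illustrative assumptions, not recommendations from the video:

```python
# Minimal local inference sketch with an open model
# (assumes: pip install transformers torch)
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "cerebras/Cerebras-GPT-1.3B"  # assumed checkpoint; pick a size that fits your hardware

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Explain the main components of an LLM tech stack:"
inputs = tokenizer(prompt, return_tensors="pt")

# Generation kept short so it can run on CPU or a free Colab GPU
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```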
@emrahe468 1 year ago
@650AILab Yes, it seems like training a custom LLM model requires tons of expertise, resources, and money (after checking the link, I realized that training-from-scratch services may go even over 500K USD). I'd better stick with GPT-4 for now 😅 and wait for your videos if anything new happens. Thank you for the clean explanation and informative link.
@650AILab 1 year ago
@emrahe468 Please check out my latest video, released today, which is on the same topic: training or tuning an LLM for the enterprise - ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-5devEOcgrG0.html Thanks.
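
For readers wondering how tuning rather than training from scratch can fit limited hardware such as Colab, here is a rough parameter-efficient fine-tuning sketch using LoRA via the Hugging Face peft library; the base checkpoint, LoRA settings, and target modules are assumptions for illustration, not steps taken from the linked video:

```python
# Parameter-efficient fine-tuning sketch with LoRA
# (assumes: pip install transformers peft torch)
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base_model = "cerebras/Cerebras-GPT-1.3B"  # assumed base checkpoint
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model)

# LoRA trains small adapter matrices instead of all weights,
# which is what makes fine-tuning feasible on a single Colab GPU.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["c_attn"],  # attention projection in GPT-2-style models
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of the base model

# From here, train the wrapped model on your custom data with
# transformers.Trainer or a plain PyTorch loop.
```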
@Priming-AI 5 months ago
I'm doing a marathon of your channel, fantastic! Thank you very much, Avkash Chauhan!
@lbf5984 1 year ago
Chat GPT-4 has been telling me about a potential tech stack, but it changes every time I ask...
@650AILab 1 year ago
Thanks for your comment, it's wonderful. Please remember that here we are talking about how to use LLMs based on the models that are available, so a general tech stack and the content we discussed here are different things.
@lbf5984 1 year ago
@Prodramp That's what I meant. I'm trying to host a local LLM conversational bot on my local machine, but ChatGPT was the only thing I could ask questions to. It even broke it down into 40 key steps, beginning with which Linux packages I needed.
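
For comparison, a local conversational bot can be sketched in a few lines with the GPT4All Python bindings mentioned earlier in the thread; the model file name below is an assumption and depends on which quantized model you download, and the API shown is the gpt4all package's Python interface, which may change between versions:

```python
# Minimal local chatbot sketch using the GPT4All Python bindings
# (assumes: pip install gpt4all; model file name is an assumption)
from gpt4all import GPT4All

model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")  # downloads the model on first run

with model.chat_session():  # keeps conversational context between turns
    while True:
        user = input("You: ")
        if user.strip().lower() in {"quit", "exit"}:
            break
        reply = model.generate(user, max_tokens=200)
        print("Bot:", reply)
```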
@shivamkumar-qp1jm 1 year ago
Can I download my trained weights? Why should I authenticate if I can deploy on my local machine?
@650AILab 1 year ago
Thanks for the comment, appreciate it sincerely. You need to separate training from fine-tuning. In training you have full control, and the updated model will have all the trained weights, which you can analyze any way you want. The fine-tuned data is converted into embedding vectors, which can be stored in the vector storage of your choice.
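
As a small illustration of the "embedding vectors stored in vector storage" part of this answer, here is a sketch using sentence-transformers and FAISS; the embedding model and the example documents are assumptions made for the snippet:

```python
# Embed documents and store/search them in a local vector index
# (assumes: pip install sentence-transformers faiss-cpu)
import faiss
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

docs = [
    "Our refund policy allows returns within 30 days.",
    "Support is available Monday to Friday, 9am-5pm.",
]
doc_vectors = model.encode(docs).astype("float32")  # FAISS expects float32

index = faiss.IndexFlatL2(doc_vectors.shape[1])  # simple exact-search index
index.add(doc_vectors)

query_vector = model.encode(["When can I return a product?"]).astype("float32")
distances, ids = index.search(query_vector, k=1)
print(docs[ids[0][0]])  # -> the refund policy document
```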
@user-wr4yl7tx3w 1 year ago
Audio could be better. There’s an echo I think.
@650AILab 1 year ago
Thanks for the comment, appreciate it sincerely.