
LLaMA3 & TinyLlama Fine-Tuned for Function Calls 

Unclecode
2.5K views

In this video, I discuss my latest project on fine-tuning LLaMA3 and TinyLlama to natively support function calls, which is crucial for the development of AI agents in the open-source community.
I cover the following topics:
- Colab notebooks demonstrating how to run the models using helper classes and GGUF versions
- Examples of using the models locally and with the Ollama server (see the sketch after this list)
- Prompt templates and usage guidelines
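As a rough illustration of the Ollama-server workflow mentioned above, here is a minimal sketch that sends a chat request to a locally running Ollama instance and parses the reply as a JSON function call. The model tag, tool schema, and system prompt are placeholders for illustration only; the actual model names and prompt templates are in the linked repo and notebooks.

# Minimal sketch, assuming an Ollama server is running locally ("ollama serve")
# and a fine-tuned function-calling model has been pulled under some tag.
# The model tag and the tool-description format are illustrative placeholders.
import json
import ollama  # pip install ollama

tools = [{
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {"city": {"type": "string"}},
}]

system_prompt = (
    "You are a function-calling assistant. "
    f"Available functions: {json.dumps(tools)}. "
    'Reply only with a JSON object: {"name": ..., "arguments": {...}}.'
)

response = ollama.chat(
    model="llama3-function-call",  # placeholder tag; use the tag from the repo
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "What's the weather in Kuala Lumpur?"},
    ],
)

# The fine-tuned model is expected to emit a JSON function call in the content.
call = json.loads(response["message"]["content"])
print(call["name"], call["arguments"])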
I also mention my plans for future improvements, such as multi-function detection, function binding, and fine-tuning models with less than 1B parameters.
This video provides an overview of my project and how you can start using these function-calling LLMs in your own projects.
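If you prefer to run the GGUF build fully locally without a server, something along these lines should also work with llama-cpp-python; the file name and prompts are again placeholders rather than the exact artifacts or chat template from the repo.

# Minimal local sketch, assuming llama-cpp-python is installed
# (pip install llama-cpp-python) and a GGUF file has been downloaded
# from the Hugging Face repo. The file name below is a placeholder.
from llama_cpp import Llama

llm = Llama(model_path="tinyllama-function-call.Q4_K_M.gguf", n_ctx=2048)

out = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "Reply only with a JSON function call."},
        {"role": "user", "content": "Book a table for two at 7 pm."},
    ],
    max_tokens=256,
    temperature=0.0,
)
print(out["choices"][0]["message"]["content"])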
Access the models, dataset, and repository through the links in the description. If you have any questions or want to contribute, feel free to open an issue on my GitHub repo: github.com/unc...
You can find models on HuggingFace: huggingface.co...
Follow me on Twitter (X) for updates on my research on function-calling for LLMs and AI agents: x.com/unclecode
I appreciate your feedback and thoughts on this project.
#OpenSourceAI #FunctionCalling #LLaMA3 #TinyLlama

Published: Sep 29, 2024

Comments: 11
@lingling8333 3 months ago
This video looks super interesting! I'm really impressed with your work. That's a huge step for open-source AI agent development. I'm curious about how large language models decide which API to call. It seems like it's all happening internally, like a "black box," and developers aren't using tools to analyze the output in real-time or directly control the process with functions. Is my understanding correct?
@surajitchakraborty1903 4 months ago
Hi, the GGUF Colab notebook cannot be accessed.
@unclecode788 4 months ago
It should be accessible by now. Please check again.
@Maisonier 4 months ago
Amazing, but it's not available in LM Studio 😥😥 Anyway, liked and subscribed. I should learn to use Ollama.
@unclecode788 4 months ago
You can load the GGUF version over there. Anyway, I'm working on a research project, and we're close to releasing what might be the smallest ever language model solely for function calls. Stay tuned!
@denijane89 4 months ago
Very nice work, but please make your code full screen; a video like this, with a dark screen and dim symbols, is super hard to follow.
@unclecode788 4 months ago
Haha, you're right. I will consider that for the next video, stay tuned!
@vedarutvija 4 months ago
Can it run on free Google Colab RAM?
@unclecode788 4 months ago
Yes, an L4 would be good enough.
@jp2kk2 4 months ago
Very cool, it was simple to understand.
@unclecode788 4 months ago
You're welcome.