In this video, I discuss my latest project on fine-tuning LLaMA3 and TinyLlama to natively support function calls, which is crucial for the development of AI agents in the open-source community.
I cover the following topics:
- Colab notebooks demonstrating how to run the models using helper classes and GGUF versions
- Examples of using the models locally and with the Ollama server
- Prompt templates and usage guidelines
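As a minimal sketch of the workflow above, the snippet below shows how a function-calling prompt and response might be handled. The schema layout, prompt wording, and JSON reply format here are illustrative assumptions, not the project's actual template (see the repo for that):

```python
import json

# Hypothetical tool schema -- the project's real template may differ.
weather_tool = {
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {"city": {"type": "string"}},
}

def build_prompt(user_query: str, tools: list) -> str:
    """Embed tool schemas and the user query into one prompt string."""
    return (
        "You can call these functions:\n"
        + json.dumps(tools, indent=2)
        + "\n\nUser: " + user_query
        + "\nRespond with a JSON function call."
    )

def parse_call(model_output: str):
    """Parse a model reply shaped like {"name": ..., "arguments": {...}}."""
    call = json.loads(model_output)
    return call["name"], call["arguments"]

prompt = build_prompt("What's the weather in Paris?", [weather_tool])
# Simulated model reply; a local model (e.g. via Ollama) would produce this.
name, args = parse_call('{"name": "get_weather", "arguments": {"city": "Paris"}}')
print(name, args)
```

The same parsing step applies whether the model runs in Colab, locally via GGUF, or behind an Ollama server; only how you obtain `model_output` changes.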
I also mention my plans for future improvements, such as multi-function detection, function binding, and fine-tuning models with less than 1B parameters.
This video provides an overview of my project and how you can start using these function-calling LLMs in your own projects.
Access the models, dataset, and repository through the links in the description. If you have any questions or want to contribute, feel free to open an issue on my GitHub repo: github.com/unc...
You can find models on HuggingFace: huggingface.co...
Follow me on Twitter (X) for updates on my research on function-calling for LLMs and AI agents: x.com/unclecode
I appreciate your feedback and thoughts on this project.
#OpenSourceAI #FunctionCalling #LLaMA3 #TinyLlama
Sep 29, 2024