
CPU-based SLMs for AI Agents and Function Calling by LLMWare 

AI Anytime
29K subscribers
3.7K views

In this comprehensive tutorial, I dive into the groundbreaking world of SLIM models (Structured Language Instruction Models) developed by LLMWare, showcasing their incredible potential for AI agents and function calling tasks. SLIM models are ingeniously designed to generate structured outputs, making them a perfect fit for complex, multi-step, and multi-model LLM-based automation workflows.
Throughout the video, I explore the capabilities of all 10 SLIM models, including the remarkable 'slim-sentiment-tool'. This tool, a Q4_K_M-quantized GGUF version of slim-sentiment, exemplifies the series' focus on providing small, specialized decoder-based LLMs. These models are fine-tuned for function calling, ensuring swift and efficient inference, optimized for multi-model concurrent deployment on CPU.
By integrating these SLIM models into a Streamlit app, I demonstrate their practical applications in real-world scenarios, offering insights into how they can enhance the capabilities of AI agents for various function-calling tasks.
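The core idea behind the models described above is that a small, specialized model returns machine-parseable JSON instead of free prose, so an automation pipeline can branch on it safely. The sketch below is a minimal, hypothetical stand-in (no real model weights or llmware calls): `run_sentiment_tool` only mimics the kind of structured dictionary a slim-sentiment-style tool is described as producing, and `route_ticket` shows how downstream code consumes it.

```python
import json


def run_sentiment_tool(text: str) -> dict:
    """Hypothetical stand-in for a SLIM-style sentiment tool.

    A real SLIM model would run quantized GGUF inference on CPU; here we
    fake the model entirely and demonstrate only the structured-output
    contract: the tool returns strict JSON, never free-form prose.
    """
    label = "positive" if "love" in text.lower() else "negative"
    raw_model_output = json.dumps({"sentiment": [label]})  # what the model emits
    return json.loads(raw_model_output)  # downstream code gets a dict, not prose


def route_ticket(text: str) -> str:
    """Branch an automation workflow on the structured output."""
    result = run_sentiment_tool(text)
    return "escalate" if "negative" in result["sentiment"] else "archive"
```

Because the output is a dict with a known key rather than prose, the routing step never needs fragile string parsing of model chatter.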
Whether you're an AI enthusiast, a developer looking to implement LLM-based solutions, or someone curious about the latest advancements in AI technology, this tutorial will provide you with a solid understanding of SLIM models and their applications.
Don't forget to LIKE, COMMENT, and SUBSCRIBE for more content on innovative AI solutions and how they can transform the digital landscape. Your support helps me create more tutorials like this, exploring the cutting edge of AI technology and its practical applications.
Streamlit App Code: github.com/AIA...
GitHub LLMWare: github.com/llm...
LLMWare HF: huggingface.co...
Join this channel to get access to perks:
/ @aianytime
To further support the channel, you can contribute via the following methods:
Bitcoin Address: 32zhmo5T9jvu8gJDGW3LTuKBM1KPMHoCsW
UPI: sonu1000raw@ybl
#ai #llm #generativeai

Published: 6 Sep 2024

Comments: 23
@MrKB_SSJ2 · 6 months ago
I rely on your channel to stay updated and skilled in the field of generative AI. THANK YOU!!!
@AIAnytime · 6 months ago
Glad you enjoy it!
@sidindian1982 · 5 months ago
Sir, I'm getting an "LLMWare module not found" error. I checked the dependencies in the requirements.txt file and everything seems fine, but when running the code with `streamlit run app.py` the error says the llmware module was not found. In my venv I tried `pip install llmware`, and after installation, when I checked the version, it says no internal/external command found. 😞
@shivamkumar-qp1jm · 6 months ago
Thanks for this! Whenever I give a demo in front of a client and use a large language model, I worry whether it will provide structured output or break my code. We can only use local large language models.
@faizahmed8015 · 6 months ago
Bro, how do you always stay updated with LLM models, and how do you understand and learn every model so quickly? Would you share some tips or suggestions?
@anubisai · 6 months ago
You could start by typing the same question into GPT-4 and following the instructions it provides for getting updates automatically from various sources...
@chopwhoopz · 6 months ago
Where is the function calling part in the video?
@TheFxdstudios · 6 months ago
🔥
@user-kl7kr6lc8r · 6 months ago
thank you
@AIAnytime · 6 months ago
Welcome!
@llmware · 6 months ago
Thank you so much for this excellent video on our SLIM models! Clear, detailed and very insightful information as always!💥💫 For more information on SLIMs please check out this video as well: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-cQfdaTcmBpY.html
@shaiknaveed78 · 6 months ago
Your videos are awesome, as always. Please do a video on Semantic Kernel.
@AIAnytime · 6 months ago
Great suggestion!
@unclecode · 6 months ago
I dunno why you call this function calling, it doesn't seem like it to me. It's just calling small models for specific tasks, and the Transformer library already has similar things. To me, function calling is when LLM is trained to extract info based on a schema within a convo, then we use that as input for a function, get the result, and pass it back to the model. So I don't get why you call this function calling. It looks like just smaller models to handle things to me. Please help me if I am missing anything here. Btw Thx for your videos as usual.
@AIAnytime · 6 months ago
SLIM makes function calling easier... The models were created for function calling and building AI agents, and you can use them alongside LLMs to make function calls. I showed this in the second-to-last video on my channel. Have a look at that... Thanks!
@user-ew4lg3th6h · 6 months ago
@AIAnytime How can we fine-tune these SLIM models?
@luigitech3169 · 6 months ago
Great! Where is the schema of the JSON responses defined?
@hoangphamhuy9096 · 6 months ago
Hi, do you have any video about creating text-to-speech, and speech using a custom model?
@user-iu4id3eh1x · 6 months ago
Thank you sir
@SonGoku-pc7jl · 6 months ago
amazing! :) thanks!
@AIAnytime · 6 months ago
Glad you like it!
@mcmarvin7843 · 6 months ago
Very good
@akash_a_desai · 6 months ago
How can we create data for function calling, and fine-tune our own LLM?