
Function Calling using Open Source LLM (Mistral 7B) 

AI Anytime · 27K subscribers
14K views

In this tutorial, I delve into the fascinating world of function calling using the open-source Large Language Model (LLM) Mistral 7B. Function calling is a powerful tool that can significantly enhance the capabilities of Gen AI applications. It allows for the integration of external web APIs, the execution of custom SQL queries, and the development of stable, reliable AI applications. With function calling, we can extract relevant information from diverse data sources, opening up a plethora of possibilities for developers and researchers alike.
Throughout this video, I demonstrate how to effectively utilize Mistral 7B for function calling. Whether you're looking to integrate external data into your AI project, execute complex queries, or simply explore the potential of open-source LLMs, this tutorial has got you covered.
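The general recipe for function calling with an open-source model like Mistral 7B can be sketched as follows: describe the available functions to the model in its prompt, ask for a JSON reply, then parse and dispatch the call. This is a minimal illustration, not the exact code from the video or repo; the `get_weather` function, schema shape, and prompt wording are assumptions, and a real run would send `prompt` to Mistral 7B (e.g. via `transformers` or llama.cpp) instead of using the canned reply shown here.

```python
import json

# Hypothetical function the model is allowed to call.
def get_weather(city: str) -> str:
    return f"22°C and sunny in {city}"  # stand-in for a real weather API call

TOOLS = {"get_weather": get_weather}

# JSON schema describing the tool, embedded in the prompt (assumed format).
SCHEMA = {
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {"city": {"type": "string"}},
}

def build_prompt(question: str) -> str:
    """Ask the model to answer with a JSON function call."""
    return (
        "You have access to this function:\n"
        f"{json.dumps(SCHEMA)}\n"
        'Reply ONLY with JSON like {"function": "<name>", "arguments": {...}}.\n'
        f"User: {question}"
    )

def dispatch(model_reply: str) -> str:
    """Parse the model's JSON reply and run the named function."""
    call = json.loads(model_reply)
    fn = TOOLS[call["function"]]
    return fn(**call["arguments"])

prompt = build_prompt("What is the weather in Paris?")
# In the video this prompt goes to Mistral 7B; here we use a canned reply.
reply = '{"function": "get_weather", "arguments": {"city": "Paris"}}'
print(dispatch(reply))  # → 22°C and sunny in Paris
```

The dispatch step is where the "playing with the string output" happens; an open-source model has no server-side guarantee of valid JSON, so production code usually adds validation and retries around `json.loads`.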
Your support is invaluable to the continuation and improvement of content like this. If you found this tutorial helpful, please don't forget to like, comment, and subscribe to the channel.
GitHub: github.com/AIAnytime/Function...
To further support the channel, you can contribute via the following methods:
Bitcoin Address: 32zhmo5T9jvu8gJDGW3LTuKBM1KPMHoCsW
UPI: sonu1000raw@ybl
Join this channel to get access to perks:
/ @aianytime
Your contributions help in sustaining this channel and in the creation of informative and engaging content. Thank you for your support, and I look forward to bringing you more tutorials and insights into the world of Gen AI.
#llm #mistral #ai

Science

Published: 10 Feb 2024

Comments: 45
@hanantabak · 22 days ago
Thanks for being both informative and also honest without cutting the attempts that didn’t work at first. It makes this video spontaneous and organic.
@artsofpixel · 5 months ago
This was exactly what I was wondering just a few days back. Thanks man, appreciate it ❤
@AIAnytime · 5 months ago
Any time! 😊
@TheFxdstudios · 5 months ago
Appreciate this, was just starting this journey.
@AIAnytime · 5 months ago
Thank you sir 🙏
@TheFxdstudios · 5 months ago
Yw, function calling isn’t exactly straightforward to get working properly. Vid definitely helps!
@user-iu4id3eh1x · 5 months ago
Wow... This is amazing
@skeptomai97 · 5 months ago
Excellent work!
@AIAnytime · 5 months ago
Glad you like it!
@lalluyoutub · 4 months ago
Your content is helpful, thank you. Is there an implementation for dataframe lookup using LLM function calls?
@aveek4902 · 4 months ago
Hey, awesome work! However, I am wondering whether this function-calling solution will be as robust as function calling through the OpenAI API / LangChain / other APIs like the new Mistral one. It seems like all it involves is telling the LLM to output the information in a JSON schema format according to the specified functions, and then playing around with the string output to extract the function calls? Let me know if you have any thoughts on whether the approach you used is fully reliable in forcing the model to maintain the correct format. Otherwise, really appreciate the video and tutorial; it was very helpful!
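The robustness concern raised here is real: an open-source model prompted for JSON will often wrap the object in chatter. A common mitigation is to scan the raw output for the first balanced `{...}` object and validate its shape before dispatching. The sketch below is one illustrative way to do that (the `function`/`arguments` key names are assumptions carried over from a typical schema prompt, not the video's exact format); a retry loop that re-prompts the model on failure would sit around it.

```python
import json

def extract_json(text: str):
    """Pull the first balanced {...} object out of noisy model output."""
    start = text.find("{")
    while start != -1:
        depth = 0
        for i in range(start, len(text)):
            if text[i] == "{":
                depth += 1
            elif text[i] == "}":
                depth -= 1
                if depth == 0:
                    try:
                        return json.loads(text[start : i + 1])
                    except json.JSONDecodeError:
                        break  # malformed candidate, try the next "{"
        start = text.find("{", start + 1)
    return None  # no parsable object found; caller should re-prompt

def validate_call(obj, known_functions) -> bool:
    """Check the parsed object matches the expected call shape."""
    return (
        isinstance(obj, dict)
        and obj.get("function") in known_functions
        and isinstance(obj.get("arguments"), dict)
    )

noisy = ('Sure! Here is the call:\n'
         '{"function": "get_weather", "arguments": {"city": "Paris"}}\n'
         'Hope that helps.')
call = extract_json(noisy)
print(validate_call(call, {"get_weather"}))  # → True
```

Libraries like Instructor (mentioned in another comment below) formalize this idea by validating model output against a schema and retrying automatically.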
@xspydazx · 2 months ago
Yes ... sorry, I just read your comment after posting.
@VijayDChauhaan · 5 months ago
Been waiting for a function calling tutorial from your channel 🦁 Today this lion will just watch the video, tomorrow he'll implement it 😅
@AIAnytime · 5 months ago
Haha, nice! I already have one created when OpenAI launched function calling: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-aKkr_lgmihw.html
@VijayDChauhaan · 5 months ago
​@@AIAnytime I meant to say with open source... But you are the one keeping me up to date in this field, bro ❤❤❤
@akash_a_desai · 5 months ago
Thanks for this
@akash_a_desai · 5 months ago
Waiting for the next part: RAG-based & API-based function calling
@AIAnytime · 5 months ago
Sure Akash. That's on the cards!
@EmilioGagliardi · 5 months ago
New to this technology, so trying to wrap my brain around it. How does this apply to building agents in AutoGen or CrewAI? Does this video imply that you can't use locally hosted LLMs for agents? My little understanding is that agents use skills and skills are python functions...do I need to use the concepts you discussed in this in order to use local open-source models for agents or am I missing something? Thanks!
@Stewz66 · 5 months ago
He is not speaking to using open-source LLMs in AutoGen/CrewAI. His tutorial is not directed toward AutoGen/CrewAI, though I suppose there might be a way to do function calling in those environments; it's beyond my skill or experience to comment on that.
@vedarutvija · 3 months ago
Can you please show a function call example using an API? For example, a weather API.
@SonGoku-pc7jl · 5 months ago
thanks! :)
@AIAnytime · 5 months ago
Welcome!
@vedarutvija · 3 months ago
I followed the same code and approach but I don't see any output other than this (repeated twice): "Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained."
@xspydazx · 2 months ago
I think we also need to capture the call for the function and execute the function... with Open Interpreter? Instructor to create templates for an expected output, and the interpreter to execute the detected function. I see that we can add these functions to the OpenAI client wrapper, so if we use Ollama or LM Studio to host, we can add the functions to the OpenAI client. I noticed the model acted differently when using Hugging Face weights versus the GGUF with llama.cpp versus calling it via the API server (llama.cpp / LM Studio), but for function calling it worked best on the LM Studio API with the OpenAI client. With the Open Interpreter / Instructor combo you can build your own method, as Open Interpreter is also a wrapper which intercepts the responses (same as you did) and executes these functions after removing them from the input while passing the message along the wrapper chain. Hence you don't need chains either, as you're creating your own chains.
@xspydazx · 2 months ago
Nice: it seemed to work (not great), but it's the edge of tech!
@TheRamseven · 5 months ago
Thanks for the video. Is it possible to add external APIs, or only through libraries?
@AIAnytime · 5 months ago
Yes, absolutely
@muhammadadribmahmud5012 · 5 months ago
Is it possible to do function calling using the Qwen 0.5B model?
@xspydazx · 1 month ago
Yes ... if you intercept the output first, you can check whether it's a function call; if so, execute the function first, return the result to the model, and then when it returns the response it will be complete. Check your response messages for a function-call tag or final-output tag, or create the template for a single function and pass only this function: use that to execute the Python script on the system or in a Jupyter cell (remote) and return the result. All cells executed on a notebook have the same format, so the model can just write any function it needs, bash or Python, and you return the output.
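The intercept-execute-return loop this comment describes can be sketched as below. `fake_model` is a stand-in for the hosted LLM (in practice this would be a chat-completion call to LM Studio, Ollama, or llama.cpp), and the `<function_call>` tag and `add` tool are illustrative assumptions, not a format the video prescribes.

```python
import json

def run_tool(call: dict) -> str:
    """Execute the function named in an intercepted call (hypothetical tool)."""
    if call["function"] == "add":
        return str(call["arguments"]["a"] + call["arguments"]["b"])
    raise ValueError(f"unknown function {call['function']}")

def fake_model(messages):
    """Stand-in for the LLM: emits a function call, then a final answer."""
    tool_msgs = [m for m in messages if m["role"] == "tool"]
    if not tool_msgs:
        return ('<function_call>'
                '{"function": "add", "arguments": {"a": 2, "b": 3}}'
                '</function_call>')
    return f"The answer is {tool_msgs[-1]['content']}."

def chat(user_msg: str, model=fake_model) -> str:
    """Intercept function-call tags, execute, feed the result back, repeat."""
    messages = [{"role": "user", "content": user_msg}]
    while True:
        reply = model(messages)
        if reply.startswith("<function_call>"):
            payload = (reply.removeprefix("<function_call>")
                            .removesuffix("</function_call>"))
            call = json.loads(payload)
            # Return the tool result to the model for the next turn.
            messages.append({"role": "tool", "content": run_tool(call)})
        else:
            return reply  # final answer, no call tag detected

print(chat("What is 2 + 3?"))  # → The answer is 5.
```

The key point matches the comment: the wrapper, not the model, owns the loop — it keeps calling the model until a reply arrives with no function-call tag.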
@jatinkashyap1491 · 3 months ago
Appreciate the content. Can't access the Google Colab notebook. Thanks.
@xspydazx · 1 month ago
Download it from the GitHub repo.
@Cam0814 · 5 months ago
Can you show an example of using agents with LangGraph?
@AIAnytime · 5 months ago
Look at my LangGraph video.
@Cam0814 · 5 months ago
@@AIAnytime Looks like you're using OpenAI in that video. I would like to do it with Mistral or another 7B model.
@kai-yihsu3556 · 3 months ago
@@Cam0814 Me too, I want to use Mistral for LangGraph function calling.
@nikitkashyap9192 · 5 months ago
Hi sir, I am Nikit. I have some trouble choosing the best RAG pipeline. Can you help me with that?
@AIAnytime · 5 months ago
Yes, sure
@WalkthroughOfHell · 4 months ago
Hello Nikit, I'm working on a RAG system now. Do you mind if I get your contact to discuss RAG systems?
@parkersettle460 · 5 months ago
Hey, are you still active? Could I get some help?
@AIAnytime · 5 months ago
Sure
@parkersettle460 · 4 months ago
What's your Discord?
@parkersettle460 · 4 months ago
@@AIAnytime What is your Discord?
@ChopLabalagun · 4 months ago
🤫