
Fully local tool calling with Ollama 

LangChain
Subscribe · 58K subscribers
18K views
Tools are utilities (e.g., APIs or custom functions) that can be called by an LLM, giving the model new capabilities. However, LLMs need to be able to 1) select the correct tool and 2) form the correct tool input. To date, both have been challenging w/ local LLMs. However, Ollama recently added function calling and we've incorporated this into a new partner package. Here, we show how to use the new Ollama partner package to perform tool calling w/ the recent Groq fine-tune of Llama-3 8b. As an example, we show how to create a simple tool calling agent in LangGraph with web search and vectorstore retrieval tools that runs locally.
Llama-3-Groq-Tool-Use via Ollama -
ollama.com/lib...
Blog post for Llama-3-Groq-Tool-Use -
wow.groq.com/i...
Code:
github.com/lan...
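As a plain-Python sketch of what the tool-calling loop above does mechanically (no LangChain or Ollama required): the model's only jobs are to pick a tool name and form the arguments, and the application dispatches the call. The {"name": ..., "args": {...}} shape below mirrors what LangChain exposes via result.tool_calls, but the tool functions and names are illustrative stand-ins, not the video's actual code.

```python
from typing import Any, Callable, Dict

# Registry of callable tools, keyed by name. The LLM only ever sees the
# names and docstrings; the application owns the actual functions.
def web_search(query: str) -> str:
    """Illustrative stand-in for a web search tool."""
    return f"results for {query!r}"

def retrieve(query: str) -> str:
    """Illustrative stand-in for a vectorstore retrieval tool."""
    return f"documents about {query!r}"

TOOLS: Dict[str, Callable[..., Any]] = {
    "web_search": web_search,
    "retrieve": retrieve,
}

def dispatch(tool_call: Dict[str, Any]) -> Any:
    """Execute one tool call of the shape {'name': ..., 'args': {...}},
    mirroring the dicts LangChain exposes via result.tool_calls."""
    fn = TOOLS[tool_call["name"]]
    return fn(**tool_call["args"])

# A model response would carry something like this (here hand-written):
call = {"name": "web_search", "args": {"query": "Llama-3-Groq-Tool-Use"}}
print(dispatch(call))
```

The hard part, as the video notes, is getting a local model to emit a valid name and well-formed args in the first place; the dispatch step itself is trivial.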

Published: 3 Oct 2024

Comments: 40
@TheYoungSoul · 2 months ago
Thank you for this example!! I just ran through it using the llama3.1 8B model and it worked flawlessly. llama3 does not work, but the 3.1 model did. I actually did not expect that.
@automatalearninglab · 2 months ago
Nice! Love it! I was looking for something like this today! So glad I decided to catch up on my langchain videos! hehe Cheers!
@davesabra4320 · 2 months ago
very very clearly explained. Thanks.
@blanky0230 · 2 months ago
Still killing it
@Imran-Alii · 2 months ago
Awesome!!!
@IdPreferNot1 · 2 months ago
This is THE content! Please take it to the top ---> source code link for longer script?
@JDWilsonJr · 2 months ago
Hello @IdPreferNot1. Apologies as I do not understand. May I trouble you for specific instructions to see the link to the notebook. I am clearly missing something. Thank you for your help.
@IdPreferNot1 · 2 months ago
@@JDWilsonJr I'm saying it's great content. He'd make it better if he shared the source code he went through. :)
@LangChain · 2 months ago
@@IdPreferNot1 Here it is! github.com/langchain-ai/langgraph/blob/main/examples/tutorials/tool-calling-agent-local.ipynb
@JDWilsonJr · 2 months ago
Hello Lance. Great presentation. Looking everywhere for your jupyter notebook. You introduce so many new concepts in your tutorials that it is almost impossible to reproduce visually from the video. I see the version you used in the video remained untitled through the end. Will you be posting the notebook in github examples like you have in the past? Your work is amazing and valuable and we are scrambling to catch up!
@LangChain · 2 months ago
Here it is! github.com/langchain-ai/langgraph/blob/main/examples/tutorials/tool-calling-agent-local.ipynb
@JDWilsonJr · 2 months ago
@@LangChain Sooo appreciate your response and the link. Keep up the great work.
@kyudechama · 2 months ago
Somehow I only get one tool call in my list as an answer, even if I ask a question that would warrant multiple tool calls. The Ollama API is able to return multiple tool calls, OpenAI as well. I tried several models, including llama3.1, llama3, firefunctionv2, and the Groq versions. Could it be your system prompt that prevents returning multiple function calls?
@omni9796 · 2 months ago
Great video! Is this also available for Node?
@dimosdennis · 2 months ago
It is good, but it is still not there. I did several tests where I give it two dummy tools to use, and it is able to distinguish quite effectively. However, it will always call the tools, even when asked not to. I tried different prompts, no good. Still, it is better than it was, and the package is nice :)
@ai-touch9 · 2 months ago
Excellent work, as usual. :) Can you share the code link?
@LangChain · 2 months ago
Here it is! github.com/langchain-ai/langgraph/blob/main/examples/tutorials/tool-calling-agent-local.ipynb
@jonasopina3529 · 2 months ago
I'm trying to use it with Ollama, but from another computer on the same network, and can't set the base_url. I'm trying to set it like llm = ChatOllama(model="modelName", base_url="http::11434"...), but it doesn't work.
@alenjosesr3160 · 2 months ago
Ollama's bind_tools says not implemented:

def bind_tools(
    self,
    tools: Sequence[Union[Dict[str, Any], Type[BaseModel], Callable, BaseTool]],
    **kwargs: Any,
) -> Runnable[LanguageModelInput, BaseMessage]:
    raise NotImplementedError()
@utkucanaytac5417 · 2 months ago
Use "from langchain_ollama import ChatOllama", not the one from the community models package.
@user-mi8gf5ez5g · 2 months ago
Could you please share a notebook link? Thanks for making these videos.
@LangChain · 2 months ago
Here it is! github.com/langchain-ai/langgraph/blob/main/examples/tutorials/tool-calling-agent-local.ipynb
@LyuboslavPetrov · 1 month ago
Would be neat if NO proprietary/paid tools were used (e.g. for embedding or web search). But, of course, no big deal to do this ourselves. Thank you.
@eMotionAllDamAge_ · 2 months ago
Great content! Please, share the code 😃
@LangChain · 2 months ago
Here it is! github.com/langchain-ai/langgraph/blob/main/examples/tutorials/tool-calling-agent-local.ipynb
@eMotionAllDamAge_ · 2 months ago
@@LangChain Thanks !
@kuruphaasan · 1 month ago
3:32 I get an empty array when I run the exact same code, can you help me here? My langchain-ollama package version is 0.1.1 and I have tried both llama3-groq fine-tuned model and llama3.1
@BushengZhang · 1 month ago
Yes, I have encountered the same problem!!! I've been puzzled for half a day...
@kuruphaasan · 1 month ago
@@BushengZhang I am still not able to figure out the reason, I have checked with github issues also. Not sure if it's a bug or something else.
@BushengZhang · 1 month ago
Oh, I just found a solution: I switched to OllamaFunctions to structure the LLM's outputs, and it worked.
@kuruphaasan · 1 month ago
@@BushengZhang Oh, great. Can you please share the example code?
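[Editor's note] For readers hitting the empty tool_calls list discussed in this thread: one workaround in the spirit of the structured-output fix mentioned above is to prompt the model for a JSON object and parse it yourself. The sketch below is generic Python, not the commenter's actual code; the {"name": ..., "args": {...}} keys are an assumption chosen to match LangChain's tool-call shape.

```python
import json
from typing import Any, Dict, Optional

def parse_tool_call(text: str) -> Optional[Dict[str, Any]]:
    """Best-effort extraction of a {'name': ..., 'args': {...}} tool call
    from raw model text. Returns None if no valid call is found."""
    # Trim to the outermost braces, in case the model wraps the JSON in prose.
    start, end = text.find("{"), text.rfind("}")
    if start == -1 or end <= start:
        return None
    try:
        obj = json.loads(text[start:end + 1])
    except json.JSONDecodeError:
        return None
    # Accept only well-formed tool calls: a name plus a dict of args.
    if isinstance(obj, dict) and "name" in obj and isinstance(obj.get("args"), dict):
        return obj
    return None

raw = 'Sure! {"name": "validate_user", "args": {"user_id": 123}}'
print(parse_tool_call(raw))
```

This sidesteps the model's native tool-calling path entirely, so it also works with models whose fine-tune never emits tool-call tokens, at the cost of doing your own validation.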
@bhaibhai-qe8tt · 1 month ago
response = ChatOllama(
TypeError: 'method' object is not subscriptable
@AnthonyGarland · 2 months ago
This is the code at about 3:48:

from typing import List
from typing_extensions import TypedDict
from langchain_ollama import ChatOllama

def validate_user(user_id: int, addresses: List) -> bool:
    """Validate user using historical addresses.

    Args:
        user_id: (int) the user ID.
        addresses: Previous addresses.
    """
    return True

llm = ChatOllama(
    model="llama3-groq-tool-use",
    temperature=0,
)

llm_with_tool = llm.bind_tools([validate_user])

result = llm_with_tool.invoke(
    "Could you validate user 123? They previously lived at "
    "123 Fake St in Boston MA and 234 Pretend Boulevard in "
    "Houston TX."
)
result.tool_calls
@hor1zonLin · 2 months ago
Why do I get [ ], the empty list, when I run the same code?
@hor1zonLin · 2 months ago
from typing import List
from langchain_ollama import ChatOllama
from typing_extensions import TypedDict

def validate_user(user_id: int, addresses: List) -> bool:
    """Validate user using historical addresses.

    Args:
        user_id: (int) the user ID.
        addresses: Previous addresses.
    """
    return True

llm = ChatOllama(
    model="llama3-groq-tool-use",
    temperature=0,
).bind_tools([validate_user])

result = llm.invoke(
    "Could you validate user 123? They previously lived at "
    "123 Fake St in Boston MA and 234 Pretend Boulevard in "
    "Houston TX."
)
result.tool_calls
[ ]
@kuruphaasan · 1 month ago
​@@hor1zonLin were you able to figure out and fix the issue?
@BushengZhang · 1 month ago
@@kuruphaasan I have the same problem!