You pin the current date in the search tool and only supply the topic as a function parameter. Would it be possible to use functions with more than one parameter, or should I work around that in the prompt, i.e. have the agent create, for example, a JSON string with the parameters, which I would then parse inside the function used as a tool?
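For what it's worth, Swarm builds the tool's JSON schema from the function's type hints, so a tool with several typed parameters generally works without any manual JSON parsing. A minimal sketch (the function name and body are hypothetical stubs, not from the video):

```python
# Hypothetical stub: Swarm reads the type hints and generates a JSON
# schema exposing BOTH parameters to the model, so no manual JSON
# parsing inside the tool is needed.
def get_news_articles(topic: str, date: str) -> str:
    return f"Top articles about {topic} from {date}"

# Registration is identical to the single-parameter case (sketch):
# from swarm import Agent
# news_agent = Agent(name="News Assistant", functions=[get_news_articles])
```

Whether the model actually fills in both arguments reliably is a separate question, especially with small local models.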
Great video! How can I use a .env file in Python with Swarm() instead of setting OS environment variables? I know you can pass api_key to the openai client, but Swarm does not take that parameter. I'm trying to get this working with the Mistral Large model, so I need the api_key and base_url parameters. Thanks!
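One way around this: Swarm() doesn't take api_key/base_url itself, but it does accept a preconfigured OpenAI client via its `client` parameter, and python-dotenv can load the .env file. A sketch under those assumptions (the env var name, endpoint URL, and model id are my assumptions, so double-check them against Mistral's docs):

```python
import os

from dotenv import load_dotenv   # pip install python-dotenv
from openai import OpenAI
from swarm import Swarm

load_dotenv()  # reads MISTRAL_API_KEY (assumed name) from your .env file

# Build a client pointed at Mistral's OpenAI-compatible endpoint (assumed URL)
mistral_client = OpenAI(
    api_key=os.environ["MISTRAL_API_KEY"],
    base_url="https://api.mistral.ai/v1",
)

# Hand the preconfigured client to Swarm instead of letting it build its own
client = Swarm(client=mistral_client)
# Agents would then set model="mistral-large-latest" (assumed model id)
```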
Can you make a guide for non-coders, from basic to advanced AI agents? The modules could cover: what tools are needed, how to set up those tools, and how to create the agents.
You are correct. Thank you :) I probably shouldn't have included that function. I've updated the code in the description. I still need to test how reliable it is to use multiple functions and agents inside one agent.
@unclecode Based on my tests, llama3.2 is not able to call multiple tools at the same time, but gpt-4o was. So to run multiple agents with small models, this is the optimal solution (code in the description). Technical detail: gpt-4o was able to call transfer_to_editor_assistant and get_news_articles in the same turn, but llama3.2 used only get_news_articles, even though both functions were provided. Based on my 3 tests.
I am using a conda virtual environment, and in VS Code the ollama command is recognized and I can run the model from this environment, but I get an error that gpt-4o is not found. Why? I already set up the environment variable and verified it. "openai.NotFoundError: Error code: 404 - {'error': {'message': 'The model `gpt-4o` does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}"
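A gpt-4o 404 like this usually means the request is still going to api.openai.com with Swarm's default model, rather than to your local Ollama server. You generally have to do both things: point the OpenAI client at Ollama, and set the agent's model explicitly. A sketch, assuming Ollama's usual local endpoint (the URL and placeholder key follow Ollama's conventions, not anything from this video):

```python
from openai import OpenAI
from swarm import Swarm, Agent

# Point the OpenAI SDK at the local Ollama server (OpenAI-compatible API).
ollama_client = OpenAI(
    base_url="http://localhost:11434/v1",
    api_key="ollama",  # any non-empty string; Ollama ignores the value
)

client = Swarm(client=ollama_client)

agent = Agent(
    name="Assistant",
    model="llama3.2",  # must match a model you've pulled with `ollama pull`
    instructions="You are a helpful assistant.",
)
```

If the model= line is missing, Swarm falls back to its default (gpt-4o), which Ollama doesn't serve — hence the 404.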
The code shown in the YouTube video and the blog is not working. To fix this error you need to add model=os.environ["OPENAI_MODEL_NAME"], but even then the whole workflow does not seem to work with Ollama, especially transferring between agents.
Sorry for the confusion. Here is the fixed code without that function: mer.vin/2024/10/openai-swarm-local/ I have also added the model name in the code, passed to the Agent function.
Error code: 404 - {'error': {'message': 'The model `llama3.2` does not exist or you do not have access to it.'}} — it is going to OpenAI instead of my local Ollama. Kindly help.
Now I am trying to use "run_demo_loop" and it is giving the same error again, even though I already passed the openai object to Swarm. Don't know why it's always like this :p one way or another, something always asks for money :d
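If I'm reading the swarm repo correctly, run_demo_loop constructs its own Swarm() internally, so the client you passed to your Swarm instance never reaches it — which would explain the identical 404. A workaround (a sketch, not the library's official API for this) is to write a small REPL of your own around client.run:

```python
from openai import OpenAI
from swarm import Swarm, Agent

# Same Ollama setup as before (conventional local defaults, adjust as needed)
ollama_client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
client = Swarm(client=ollama_client)

agent = Agent(name="Assistant", model="llama3.2")

# Minimal replacement for run_demo_loop that uses OUR client
messages = []
while True:
    user_input = input("> ")
    if user_input.lower() in {"quit", "exit"}:
        break
    messages.append({"role": "user", "content": user_input})
    response = client.run(agent=agent, messages=messages)
    messages = response.messages        # carry the full history forward
    agent = response.agent              # keep any agent handoff that occurred
    print(messages[-1]["content"])
```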
Yes, it can. LangChain can be used by beginners, but advanced users want to know what's happening behind the scenes when they run an agent, and in that situation OpenAI Swarm is better.