
Dynamic Few-shot Prompting with Llama 3 on local Environment | Ollama | Langchain | SQL Agent 

TheAILearner
848 subscribers
1.4K views

This video teaches you how to implement dynamic few-shot prompting with open-source LLMs like Llama 3 using Langchain in a local environment.
In this tutorial, we will follow these steps:
1. Import Llama 3: Begin by loading Llama 3 locally through Ollama.
2. Fetch SQL Data: Connect to your SQL database and fetch the data you need. This involves establishing a connection to a SQLite database.
3. Initialize Few-Shot Examples: Initialize a set of few-shot examples that will guide the model.
4. Convert Examples to Embeddings: Transform the few-shot examples into embeddings so they can be compared against incoming questions.
5. Create Custom Tools: Develop custom tools tailored to your specific needs (here, tools for interacting with the SQL database).
6. Create Prompt: Design the prompt that will be used to interact with the model.
7. Create an Agent with ReAct Logic: Develop an agent that incorporates ReAct (Reasoning and Acting) logic. This agent uses the prompt and the dynamically selected few-shot examples to perform tasks interactively.
8. Agent Executor: Implement the agent executor, which manages the execution of tasks by the agent and handles the flow of information between the agent and the rest of your system.
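The core of steps 3–4 (the "dynamic" part of dynamic few-shot prompting) is: embed every example, embed the incoming question, and splice only the nearest examples into the prompt. In the video this is handled by LangChain's example-selector machinery backed by Ollama embeddings; the sketch below is a self-contained illustration of that idea using a toy bag-of-words "embedding" so it runs without any model server. The example list and helper names (`select_examples`, `build_prompt`) are illustrative, not taken from the video's code.

```python
import math
from collections import Counter

# Toy few-shot examples pairing a natural-language question with its SQL.
EXAMPLES = [
    {"input": "List all artists.", "query": "SELECT * FROM Artist;"},
    {"input": "Find all albums for the artist 'AC/DC'.",
     "query": "SELECT * FROM Album WHERE ArtistId = "
              "(SELECT ArtistId FROM Artist WHERE Name = 'AC/DC');"},
    {"input": "How many employees are there?",
     "query": "SELECT COUNT(*) FROM Employee;"},
]

def embed(text):
    """Stand-in for a real embedding model: a bag-of-words term-count vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def select_examples(question, k=2):
    """Return the k examples most similar to the question."""
    q = embed(question)
    ranked = sorted(EXAMPLES,
                    key=lambda ex: cosine(q, embed(ex["input"])),
                    reverse=True)
    return ranked[:k]

def build_prompt(question, k=2):
    """Splice the selected examples plus the new question into one prompt."""
    shots = "\n".join(f"Q: {ex['input']}\nSQL: {ex['query']}"
                      for ex in select_examples(question, k))
    return f"{shots}\nQ: {question}\nSQL:"

print(build_prompt("How many artists are there?", k=1))
```

With a real embedding model, similarity reflects meaning rather than word overlap, but the selection and prompt-assembly logic is the same shape.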
Code Link: github.com/The...
Ollama Github: github.com/oll...
SQL Agent with Llama 3 (With Ollama Installation in Local): • Build an SQL Agent wit...
#dynamicfewshotprompting #sqlagent #llama3 #langchain #ollama #customtools #customagent #fewshotprompting #sql #database #machinelearning #nlp

Published: 27 Aug 2024

Comments: 10
@umeshtiwari9249 · a month ago
Thanks for such a nice tutorial on a complex topic.
@GordonShamway1984 · a month ago
Very nicely explained. You helped me a lot, thank you!
@NillsBoher · a month ago
Great!!!! Thanks for sharing your knowledge! However, I want to ask: isn't the prompt too long for Llama 3's context?
@theailearner1857 · a month ago
Not at all. Llama 3 has a context length of 8,192 tokens, while the prompt shown in the video is only about 450 to 500 tokens.
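The headroom implied by that reply is easy to check with quick arithmetic (numbers taken from the reply above; the prompt size is the quoted upper bound, not a measured value):

```python
CONTEXT_WINDOW = 8192   # Llama 3 context length, in tokens
PROMPT_TOKENS = 500     # upper end of the prompt size quoted above

remaining = CONTEXT_WINDOW - PROMPT_TOKENS
share = PROMPT_TOKENS / CONTEXT_WINDOW
print(f"Prompt uses {share:.1%} of the window; "
      f"{remaining} tokens remain for the schema, tool output, and the answer.")
```

So even with the few-shot examples spliced in, the prompt occupies only about 6% of the context window.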
@MScProject-u9n · a month ago
How can I run it in Colab instead of a local environment?
@MScProject-u9n · a month ago
Can you also provide us the source code?
@theailearner1857 · a month ago
You can check out this video to run Ollama-based models on Google Colab, after which the dynamic few-shot prompting steps can be easily implemented. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-XDvTt_TOewU.htmlsi=RkjXX-jO3VSA08Em
@theailearner1857 · a month ago
Code Link: github.com/TheAILearner/Langchain-Agents/blob/main/Dynamic%20Few-shot%20Prompting%20with%20Llama%203.ipynb
@MeTuMaTHiCa · a month ago
It will be good when this works with the cloud.
@MeTuMaTHiCa · a month ago
By the way, thanks for the good AI work.