
Langchain Memory Model | How can LLM AI hold a ChatGPT-like conversation? 

DevXplaining
4.2K subscribers · 4K views

In this video, I'll cover the Langchain Memory API, using ConversationBufferMemory and ChatMessageHistory as examples. I'll share some thoughts on why this is cool and essential for a developer to learn. The code examples I show and run come from the Langchain tutorials, so following the links below is enough to keep up.
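The actual classes are covered in the linked tutorials; as a rough, hedged stand-in, the core idea behind ConversationBufferMemory can be sketched in plain Python (the class and method names below are illustrative, not the real Langchain API):

```python
class BufferMemory:
    """Minimal sketch of the idea behind ConversationBufferMemory:
    keep every turn and replay the whole transcript as context."""

    def __init__(self):
        self.messages = []  # list of (role, text) tuples

    def add_user_message(self, text):
        self.messages.append(("Human", text))

    def add_ai_message(self, text):
        self.messages.append(("AI", text))

    def as_prompt_context(self):
        # The LLM itself is stateless, so the full history is
        # rendered into the prompt on every call.
        return "\n".join(f"{role}: {text}" for role, text in self.messages)


memory = BufferMemory()
memory.add_user_message("Hi, I'm Anna.")
memory.add_ai_message("Nice to meet you, Anna!")
memory.add_user_message("What's my name?")
print(memory.as_prompt_context())
```

The point is simply that "memory" here is an application-side buffer that gets prepended to each prompt, which is what lets a stateless model appear to remember the conversation.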
As always, show some love by clicking that like button and subscribing to my channel if this kind of content interests you. Also, feel free to comment, share links to my videos, or request future content.
Here are the links covered in the video:
- • Getting Started With L... (Getting started with Langchain)
- • How to run ChatGPT in ... (GPT4All a free local model)
- • Chat with your blogs |... (Chat with your blogs via Langchain)
- • How To Create ChatGPT ... (Build an OpenAI Virtual assistant with speech interface)
- python.langcha...
- api.python.lan...

Published: 28 Aug 2024

Comments: 16
@phil.d6449 · 1 year ago
I recently began with Langchain and your videos are well explained, thank you. We are so early.
@DevXplaining · 1 year ago
Thank you, I agree with that. Langchain is not perfect but it's a great learning tool right now.
@binstitus3909 · 7 months ago
How can I keep the conversation contexts of multiple users separate?
@DevXplaining · 7 months ago
Hi, very good question! You'd need to handle that outside Langchain. A minimal solution would be a map of user identities to their chat histories, but that kind of feature would probably also require authenticating users and storing the chat context more permanently; in other words, traditional development work. For the LLM's purposes, you simply pass in the relevant context (the chat history for the current user).
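The "map of user identities to histories" idea above can be sketched in a few lines of plain Python. This is a hypothetical application-side store, not a Langchain feature; the function names are made up for illustration:

```python
from collections import defaultdict

# Per-user history store: the application, not the library,
# decides whose turns go into which bucket.
user_histories = defaultdict(list)

def record_turn(user_id, role, text):
    """Append one conversation turn to a specific user's history."""
    user_histories[user_id].append((role, text))

def context_for(user_id):
    """Render only this user's turns for passing to the model."""
    return "\n".join(f"{role}: {text}" for role, text in user_histories[user_id])

record_turn("alice", "Human", "Remember: my project is called Skylark.")
record_turn("bob", "Human", "Hi there.")
print(context_for("alice"))
```

In a real system the keys would come from an authentication layer and the store would live in a database rather than a process-local dict, as the reply above notes.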
@Xanderfied · 8 months ago
Why doesn't OpenAI implement this feature in ChatGPT? They could have the API store the conversation on the end user's HDD, under a temp folder. It makes so much sense, I don't get it.
@DevXplaining · 8 months ago
Well, there are ways to do that. However, ownership of the data is still a bit superfluous, since in this model you still need to transmit the conversation back to the model as context/input every time. But this is awesome when you run a local model, and open-source, lightweight local models are becoming more awesome every day. Since I made this video, ChatGPT has rolled out custom models, or GPTs, that let you do similar things online, packaged. Not quite the same, but definitely aiming at similar use cases.
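The "transmit the conversation back every time" point is worth making concrete: chat-style LLM APIs are stateless, so every request carries the full message list and the payload grows with each turn. A minimal sketch, with a fake stand-in for the model call (there is no real API here, just the request shape):

```python
# fake_llm stands in for a real chat-completion call; it only
# demonstrates that the whole history travels with every request.
def fake_llm(messages):
    return {"role": "assistant",
            "content": f"(reply given {len(messages)} prior messages)"}

messages = [{"role": "system", "content": "You are a helpful assistant."}]

for user_text in ["Hello!", "What did I just say?"]:
    messages.append({"role": "user", "content": user_text})
    reply = fake_llm(messages)   # full history goes over the wire each time
    messages.append(reply)

# 1 system message + 2 user turns + 2 assistant replies = 5 messages
print(len(messages))
```

This is why client-side storage alone doesn't keep the data away from the provider: whatever is stored locally still has to be sent back as context on the next call, which is exactly the trade-off the reply above describes, and why a local model changes the picture.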
@Xanderfied · 8 months ago
@@DevXplaining It would be no more superfluous than the information OpenAI is already privy to. I know they say all the prompts you feed into GPT are never stored, read, manipulated, etc., but come on, they certainly have the ability to live-monitor and even adjust anything that comes through their servers, despite the data-security spiel they give the public. All they have to do is tell the public that all chat history is stored solely client-side and that the API only has access to said data per session, and only when granted permission by the end user. Make a EULA you agree to in order to use the feature, stating such, and problem solved.
@Xanderfied · 8 months ago
@@DevXplaining And yes, I've seen the custom GPT models that have the chain feature, but I think something built in is not only a good idea but, if AI is to move forward, inevitable.
@brezl8 · 1 year ago
great!!
@DevXplaining · 1 year ago
Thank you! :)
@aftermath7 · 1 year ago
Lol make a video on how to earn with ChatGPT
@DevXplaining · 1 year ago
Haha, cannot, still dirt broke here :)
@aftermath7 · 1 year ago
@@DevXplaining be my mentor?🛐💀
@valberm · 1 year ago
Are you an AI?
@DevXplaining · 1 year ago
As an AI language model I can no more confirm than deny questions related to my identity.
@computadorhumano949 · 1 year ago
@@DevXplaining kkkkkkk