
LangChain Tutorial (JS) #7: Long Term Conversation Memory 

Leon van Zyl
20K subscribers
3.8K views

#openai #langchain #langchainjs
The Memory modules in Langchain make it simple to permanently store conversations in a database, so that we can recall and continue those conversations in the future - just like the conversations in ChatGPT.
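Roughly, the pattern covered in the video (BufferMemory, ConversationChain and Upstash Redis message history) looks like the sketch below. This is a minimal illustration only; the video's actual code and the exact import paths may differ between LangChain JS versions.

    import { ChatOpenAI } from "@langchain/openai";
    import { ConversationChain } from "langchain/chains";
    import { BufferMemory } from "langchain/memory";
    import { UpstashRedisChatMessageHistory } from "@langchain/community/stores/message/upstash_redis";

    // Persist the conversation in Upstash Redis, keyed by a session id.
    const memory = new BufferMemory({
      memoryKey: "history",
      chatHistory: new UpstashRedisChatMessageHistory({
        sessionId: "example-session", // any unique id per conversation
        config: {
          url: process.env.UPSTASH_REDIS_REST_URL,
          token: process.env.UPSTASH_REDIS_REST_TOKEN,
        },
      }),
    });

    const model = new ChatOpenAI();
    const chain = new ConversationChain({ llm: model, memory });

    // The chain loads past messages before calling the model
    // and saves the new turn to Redis afterwards.
    const res = await chain.invoke({ input: "Hi, my name is Leon." });
    console.log(res.response);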
📑 Useful Links:
Langchain JS Chat Memory Docs: js.langchain.com/docs/integra...
Source Code: github.com/leonvanzyl/langcha...
Upstash: upstash.com/?Leon_...
☕ Buy me a coffee:
www.buymeacoffee.com/leonvanzyl
💬 Chat with Like-Minded Individuals on Discord:
/ discord
🧠 I can build your chatbots for you!
www.cognaitiv.ai
🕒 TIMESTAMPS:
00:00 - Introduction to Memory
00:47 - The Problem
01:40 - Project Setup
03:30 - BufferMemory
05:33 - ConversationChain
08:43 - Long Term Memory
08:56 - Adding Upstash Redis Memory
10:50 - Create Upstash Account
12:41 - LCEL RunnableSequence Approach
17:31 - Save Context to Memory

Science

Published: 17 Jul 2024

Comments: 30
@leonvanzyl 5 months ago
Forgot to mention that you can add memory to an agent in the exact same way. The AgentExecutor class also accepts the memory property. The rest is exactly the same 😎.
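In other words, something along these lines (a sketch only; agent and tools are placeholders for whatever you built in the agents video, and memory is the same BufferMemory instance as before):

    import { AgentExecutor } from "langchain/agents";

    // Reuse the same memory instance you passed to ConversationChain.
    const executor = new AgentExecutor({
      agent,
      tools,
      memory,
    });

    const res = await executor.invoke({ input: "What was my previous question?" });
    console.log(res.output);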
@nomad4691 5 months ago
I'm blown away you don't have more subs. Following your Finance Tracker, Next.js, and now your LangChain series. All relevant to my career, and you have such a great way to explain and demonstrate.
@leonvanzyl 5 months ago
Thank you!
@zmeireles68 5 months ago
Awesome video Leon. The runnables part is a bit confusing, but I guess I'll have to dig into the LangChain docs.
@leonvanzyl 5 months ago
Honestly, I prefer the first method. For completeness' sake, I had to introduce RunnableSequence.
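For reference, the RunnableSequence approach boils down to loading the memory variables into the prompt yourself and calling saveContext afterwards. A rough sketch (not necessarily the video's exact code; prompt, model and memory are assumed to exist already):

    import { RunnableSequence } from "@langchain/core/runnables";

    const chain = RunnableSequence.from([
      {
        input: (initialInput) => initialInput.input,
        memory: () => memory.loadMemoryVariables({}),
      },
      {
        input: (previousOutput) => previousOutput.input,
        history: (previousOutput) => previousOutput.memory.history,
      },
      prompt, // ChatPromptTemplate with a {history} placeholder
      model,
    ]);

    const input = { input: "Hi, my name is Leon" };
    const response = await chain.invoke(input);

    // Unlike ConversationChain, we have to persist the turn ourselves.
    await memory.saveContext(input, { output: response.content });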
@mrinank 2 months ago
please continue this series
@TheWhoIsTom 3 months ago
Dedicated videos about runnable sequence would be nice
@divyamjain8804 1 month ago
Please continue this series. 🥹🥹
@leonvanzyl 1 month ago
Will do.
@michaeldavidgarcia5998 3 months ago
Thanks for these videos Leon, amazing job! I wanted to ask you: how can I implement this memory with documents, like the tutorial in video 5, working with for example 'historyAwareRetriever' and my own documents?
@leonvanzyl 3 months ago
Most of these chains take memory as input.
@michaeldavidgarcia5998 3 months ago
@leonvanzyl Thanks for replying Leon! I mean, neither createStuffDocumentsChain, createHistoryAwareRetriever, nor createRetrievalChain takes memory as input, and I can't find an example in the documentation of doing it with my own documents. Do you know how? Maybe I have to create a separate database, select the rows and start creating HumanMessage and AIMessage objects?
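One possible approach (just a sketch, not from the video): the chain returned by createRetrievalChain accepts a chat_history input on invoke, so you can read the stored messages yourself and append the new turn afterwards. Here retrievalChain, chatHistory and userInput are placeholder names.

    import { HumanMessage, AIMessage } from "@langchain/core/messages";

    // chatHistory could be the same UpstashRedisChatMessageHistory used earlier.
    const pastMessages = await chatHistory.getMessages();

    const res = await retrievalChain.invoke({
      input: userInput,
      chat_history: pastMessages,
    });

    // Persist the new turn so the next call sees it.
    await chatHistory.addMessage(new HumanMessage(userInput));
    await chatHistory.addMessage(new AIMessage(res.answer));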
@Jaimie-C 4 months ago
Thank you for the great video! I think a video on runnable sequences would be good.
@leonvanzyl 4 months ago
You're welcome 🤗
@Jaimie-C 4 months ago
@leonvanzyl Also, maybe do a video on connecting this with an HTML/CSS/JavaScript UI.
@verticalstatue 3 months ago
I agree. Runnable sequence pls
@judgebot7353 4 months ago
How do I integrate the retrieval chain that we created in the previous video (#5) with the conversational chain? E.g.:

    const HistoryAwareRetrievalChain = await createRetrievalChain({
      retriever: historyAwareRetriever,
      combineDocsChain: documentChainChat,
    });
@ehiaig 3 months ago
Thanks for this video, it really simplifies LangChain. However, I keep getting this error from the saveContext line: "Error: output values have 1 keys, you must specify an output key or pass only 1 key as output". Would really appreciate any suggestion you have. Thanks a lot.

    async function callChain(userInput) {
      const res = await chainHistory.invoke(
        { input: userInput },
        { configurable: { sessionId: String(chatId) } }
      )
      await memoryInstance.saveContext({ input: userInput }, { output: res.content })
      return res
    }
@ahmadalmasri1583 3 months ago
thanks
@leonvanzyl 3 months ago
You're welcome!
@thijspeters4178 5 days ago
Haven't watched the series yet, but wanted to know if it will be easy to implement this into a web application once I finish it?
@leonvanzyl 3 days ago
Absolutely! Are you using Next.js? You can easily add this code to an API or server action.
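For example, a Next.js App Router route handler could wrap the chain call like this (a sketch only; callChain is a placeholder for whatever invocation you built in this tutorial):

    // app/api/chat/route.js
    export async function POST(req) {
      const { sessionId, input } = await req.json();

      // callChain stands for the chain invocation from this tutorial,
      // e.g. building memory from the sessionId and invoking the chain.
      const output = await callChain(sessionId, input);

      return Response.json({ output });
    }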
@michaeldavidgarcia5998 3 months ago
Please make a dedicated video for Runnables :D
@jonymnimonik-ff7dg 1 month ago
Yes
4 months ago
The video is great! I would like to know how to use the stream function of the chain. Usually I use:

    const outputParser = new StringOutputParser()
    const chain = prompt.pipe(model).pipe(outputParser)
    let stream = await chain.stream({
      question: "what are structs in Golang?",
    })
    for await (const chunk of stream) {
      process.stdout.write(chunk)
    }

But I understand that ConversationChain has no pipe method. I tried with a runnable sequence, but it does not work (no response).
4 months ago
Ok, I think it was because of my prompt. Now with the runnables and RunnableWithMessageHistory it works again. Your videos are great!
4 months ago
This is what I did:

    const prompt = ChatPromptTemplate.fromMessages([
      SystemMessagePromptTemplate.fromTemplate(
        "You are a TV series expert. Make short answer only"
      ),
      new MessagesPlaceholder("history"),
      HumanMessagePromptTemplate.fromTemplate(`Question: {question}`),
    ])
    const outputParser = new StringOutputParser()
    const chain = prompt.pipe(model).pipe(outputParser)

    const messageHistory = new ChatMessageHistory();
    const chainWithHistory = new RunnableWithMessageHistory({
      runnable: chain,
      getMessageHistory: (_sessionId) => messageHistory,
      inputMessagesKey: "question",
      historyMessagesKey: "history",
    })

    const config = { configurable: { sessionId: "1" } };
    let stream = await chainWithHistory.stream({ question: "Who is James T Kirk?" }, config)
@judgebot7353 4 months ago
Responses from agents are not quite accurate. I liked the retrievalChain approach much better. I just want to add long term memory. Can you help me with that please?

    const HistoryAwareRetrievalChain = await createRetrievalChain({
      retriever: historyAwareRetriever,
      combineDocsChain: documentChainChat,
    });
@nocnydrwal9499 4 months ago
Please create a video about runnables.
@leonvanzyl 4 months ago
Will do