I have completed your NLP deep learning one-shot, and I will say you have done really great work; it increased my interest in this field. Thanks for this video too. - Somesh Panchal, Pune.
Excellent video! No other video on YouTube can match this as far as AI learning is concerned. I need to know the minimum hardware configuration to run all these programs. 1. Can I run them without an NVIDIA graphics processor? 2. If I do have to purchase one, what should the configuration be, since there is no requirements.txt for hardware? Kindly let us know; if possible, make a video simply covering LLM vs. hardware configuration and price, as well as renting a remote VPS for deployment!
@krish please try to explain the small things, like why you used ChatPromptTemplate.from_template here when in other projects you used .from_messages. Otherwise it's confusing, not for everyone, but for those of us who are still students.
Hello Krish, I'm currently working on a PostgreSQL query using LangChain. I'm interested in obtaining the output in Excel format. Could you please advise on how to achieve this?
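One possible answer to the question above: write the query results to a CSV file, which Excel opens directly. This sketch uses an in-memory sqlite3 database as a stand-in for PostgreSQL (the table and data are invented); with psycopg2 the connect/execute/fetch pattern is the same:

```python
import csv
import sqlite3

# Stand-in for a PostgreSQL connection; swap in psycopg2.connect(...) in practice.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (item TEXT, qty INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", [("pen", 3), ("book", 2)])

cur = conn.execute("SELECT item, qty FROM sales")
with open("sales.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([col[0] for col in cur.description])  # header row
    writer.writerows(cur.fetchall())                      # data rows
```

If a true .xlsx file is required, pandas' DataFrame.to_excel (with openpyxl installed) is the usual route instead of the csv module.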
Hi Krish, there is a small bug in your code because of which it is not generating a poem: for the poem route it should be prompt2|model instead of prompt2|llm, i.e. add_routes(app, prompt2 | model, path="/poem").
Can you make a video on how to adapt an LLM to a specific use case? Is fine-tuning the right approach, or should one tie the model to a database (RAG) that provides the necessary information? In my case, I'm currently trying to integrate a 'learning coach' chatbot within a Moodle course. Naturally, it should know the contents of the Moodle course to be able to provide targeted help...
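The RAG idea in the comment above can be sketched with a toy retriever. This is stdlib-only word-overlap matching with made-up course snippets, purely to show the shape of the retrieval step; a real setup would use embeddings and a vector store:

```python
def retrieve(question: str, docs: list[str]) -> str:
    """Return the document sharing the most words with the question."""
    q_words = set(question.lower().split())
    return max(docs, key=lambda d: len(q_words & set(d.lower().split())))

# Hypothetical Moodle course snippets standing in for indexed course content.
course_docs = [
    "Week 1 covers Python basics and variables.",
    "Week 2 covers gradient descent and loss functions.",
]

context = retrieve("How does gradient descent work?", course_docs)

# The retrieved snippet is then prepended to the LLM prompt so the model
# answers from course material rather than from its general training data.
prompt = f"Answer using this course material:\n{context}\n\nQuestion: ..."
```

Fine-tuning, by contrast, bakes knowledge into the weights; for frequently changing course content, retrieval like this is usually the lighter-weight option.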
Great video. Clear and crisp explanation of the concept and code. Can we use this setup for comparing different LLM models' responses, or something else?
Hi Krish, please explain the reason for every line of code in detail. Explaining only at a high level is confusing. For example, the add_routes section: you should explain it in more detail.
Hi sir, I'm not able to use 'gemma' instead of 'llama2', and I don't know why. I'm able to run llama2 successfully. Are there any changes needed to use gemma? I'm able to run it from the command line using ollama. Please help.
And nothing can match the way you teach things (explaining each and every library, their use, diagrams). These are really very helpful, Krish. I really appreciate you creating all these tutorials. Please keep adding more to the playlist :)
There is a version conflict between langserve, langsmith, and langchain_core. I am getting errors because of it. Please share the versions of these libraries to install. Thanks.
Hi Krish, first of all, great work on the videos; I'm really learning a lot from them. I have a question: can you please share how we can deploy a LangGraph app using LangServe? Thanks.
Hi Krish, I wanted to share some feedback. I'm a regular viewer of your videos, and I often find it challenging to locate a specific video when I don't remember its title. Implementing a consistent naming convention or series titles could significantly improve the searchability of your content. I truly value the insights your videos provide.
Hey Krish, there is an error: whenever I run the app.py file, go to the URL, and then open /docs, it says the API load has failed. Please make a non-OpenAI version of this video as well, since those without OpenAI tokens are unable to execute what is shown. I think that is the reason, since I did not add os.environ to my code because I do not have tokens in my OpenAI account.
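If the /docs failure really is the missing key, the usual fix is setting the environment variable before the server starts. A minimal sketch, assuming you do have a key (OPENAI_API_KEY is the standard variable name OpenAI's client reads; the value below is a placeholder, not a real key):

```python
import os

# Set this before constructing ChatOpenAI / starting the FastAPI app.
# "sk-..." is a placeholder; paste your actual key or export it in the shell.
os.environ["OPENAI_API_KEY"] = "sk-placeholder"
```

Exporting it in the shell (export OPENAI_API_KEY=...) before launching app.py works equally well and keeps the key out of the source file.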
Krish, can you make a video on how to get an internship in the Generative AI field? I have worked a lot on GenAI and am currently still learning; now I want some real-world experience in the field.
I have already learned all of this. From a jobs perspective, data analytics and ML are the most important, but generative AI has the least competition, because people learn GenAI only after learning everything else, not directly. GenAI's prerequisite is NLP, and NLP's prerequisites are ML, DL, etc.